STEPHEN DOWNES: Well, here we are. Week six, looking at the final chapters. Or, as you guys put it, the next battles for openness: data, algorithms, and competency mappings.

[MUSIC - "CHARIOTS OF FIRE"]

Well, it's kind of interesting because this is the planet Mars in the background, of course. And my name is actually on the planet Mars. It's on one of the CDs that were inscribed with the names of people who were interested in the project and put on Opportunity. And Opportunity is one of the rovers that was roving and is still roving around Mars. It's kind of neat to think about. I wonder who will own that disk in the year 2400. One of the future challenges of data ownership.

You know, we look at the future challenges as new kinds of data. At least that's the way it's presented there, right? Algorithms, data, competencies, things like that. I wonder if the new kinds of challenges for openness are going to be not just the types of data, not even just the way people use the data, because people are going to use and abuse the data in a whole number of different ways. I'm wondering whether they're going to be entirely new kinds of openness.

George Orwell was among the first to think about the concept of a thought crime, and now we have to ask whether we could be open with the ways that we think. We're going to have communication systems that allow us to communicate directly, mind to mind, with each other. There will be some subvocal communication with subcutaneous-- I don't want to say amplifiers, because that would be very loud, but speakers or microphones or whatever-- that allow us to hear other people's thoughts. How open is that going to be? What kind of environment is that going to be? It's an interesting and odd sort of thought.

What about, also, the way we can combine and recombine data? Not simply, are we allowed to do this? But are there some combinations or recombinations that are OK and others that are not allowed? Now you might think that makes no sense whatsoever. But consider something like genetic data-- let's say we took genetic data and mixed it with the output of an algorithm so that we created some kind of hybrid, half machine language, half human. Would that be OK?

A lot of the issues of ethics, and of what it means to be a person and what it means to be a society, are going to be challenged by the new possibilities of creating, manipulating, and sharing new kinds of information. And I think openness is going to be challenged by these things. We're seeing that very sort of thing happen today. You know, one kind of openness was the sort of openness that was inspired by things like the Bill of Rights or Charters of Rights and Freedoms, where you have freedom of speech, freedom of assembly, freedom of religion, and so on.

You know, those were the original free-as-in-freedom freedoms, if you will. And these freedoms never envisioned a world of Facebook, or a world of social media and fake news and Donald Trump and all of that. So what are we to make of that sort of world in the future, where the freeness and the freedoms that we have now result in really strange and unusual combinations that threaten to undermine the social order, as free-as-in-speech is doing today?

How do we manage this? How do we control-- and I don't want to say control-- how do we manipulate, design, and manage our freedoms, including the five Rs, including freedom of access, freedom of information, and all of that, in a way that kind of future-proofs us against the genetic-freak kind of Donald Trump of the future? It's an interesting question. I'm going to leave you with that. Thank you for the course, guys. It's been fun. We'll see you when we see you.