Science deserves transparency and openness — on why we are tokenizing our AI for Science.

Reposted from Medium

The world needs science. Complex challenges ranging from climate change to preventive medicine require us to put our best minds together to solve them. And we do live in a world where more scientific knowledge is available to us than ever before — but the irony is that our politicians doubt its legitimacy; our researchers, pressed for time and resources, lack the capacity to communicate across even the closest hallways of a university; and publishing houses generate profit by keeping vital results hidden behind heavy paywalls, going after those who breach them with deadly force. In addition, the big software players are opaque and seemingly impossible to hold accountable for their data, their algorithms and the implications of these. Even though Tim Berners-Lee created the World Wide Web to share scientific knowledge, it seems we have only come marginally further today than we had back then.

Two years ago we set out on quite the ambitious journey: to build what we call an “AI Scientist”, a system that can augment our human intelligence by connecting the dots of all of the world’s research. The months since have been filled with hard work, progress, setbacks and a lot of rejections — but also so much love, support, understanding and encouragement from our wonderful community.

In these two years we’ve built a system that reduces the time a human needs to map out existing scientific knowledge by up to 90%, while increasing serendipity and interdisciplinary discovery. This is the first important step towards what we call the Knowledge Validation Engine, a core feature of the AI Scientist. We have a dedicated team that has built this, we have a number of budding university collaborations, we have a group of lovely investors who believe in us, and we’ve published several open access research papers. Most importantly, this past year we’ve seen an amazing community of AI trainers grow around us — more than 8,000 individuals who volunteer their time to help the system learn. We’ve seen a desire to be part of our journey, a wish to help us achieve our mission, a community coming together to tell us that what we do is important. We have done our best to honor their help, but we have not done enough.

Transparency, openness and fighting bias have been our core values from day one, but we find ourselves not living up to our own standards. We find ourselves torn between serving big corporate clients and satisfying a European venture capital community single-mindedly focused on revenue (with some very honest impact-focused exceptions) on the one hand — while also trying to bring what we build to as many people as possible.

For us to truly make an impact in the world, it is not enough to build some great tools; we need to disrupt and uproot an entire industry. We cannot do that on our own — it’s a grassroots challenge. We need your help.

Scientific knowledge is arguably the ultimate decentralized application. It is not controlled by a central agent, does not depend on any individual node, is exposed to public scrutiny and constant challenge, and is precious to a large and fast-growing cohort of current and future users.

The technology development of this decade is thrilling, and today we are taking advantage of a new opportunity. Utilizing the decentralized nature of the blockchain, we have decided to give power to our community by tokenizing our functionality — allowing anyone who contributes to the tool to generate tokens by doing so, tokens they can later use directly to access our core services.

An AI Trainer who annotates research papers, a coder who commits to our increasingly open source code, a user who reports a bug or a researcher who uses the Knowledge Validation Engine to publish their research Open Access — they will all be rewarded with tokens for contributing. The tokens can be used to access any of the tools. All token holders will have a voice, will have transparent insight into our core technology, and will be asked to hold us accountable for openness and for de-biasing our algorithms and data. And as corporates purchase access to the tools and the algorithms improve over time, the value of the tokens held by our community will increase.
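The contribute-earn-spend loop described above can be pictured as a simple ledger. The sketch below is purely illustrative: the class, the contribution types and the reward values are hypothetical placeholders, not the actual token mechanics, which the white paper will specify.

```python
class TokenLedger:
    """Illustrative ledger: contributors earn tokens for contributions
    and spend them to access tools. All values are hypothetical."""

    # Hypothetical reward schedule for the contribution types named above.
    REWARDS = {
        "annotate_paper": 5,
        "code_commit": 10,
        "bug_report": 2,
        "open_access_publication": 20,
    }

    def __init__(self):
        self.balances = {}

    def reward(self, contributor, contribution_type):
        """Credit a contributor for a recognized contribution."""
        tokens = self.REWARDS[contribution_type]
        self.balances[contributor] = self.balances.get(contributor, 0) + tokens
        return tokens

    def spend(self, contributor, cost):
        """Debit tokens when a contributor accesses a tool."""
        if self.balances.get(contributor, 0) < cost:
            raise ValueError("insufficient tokens")
        self.balances[contributor] -= cost


ledger = TokenLedger()
ledger.reward("alice", "annotate_paper")  # earns 5 tokens
ledger.reward("alice", "bug_report")      # earns 2 more
ledger.spend("alice", 3)                  # spends 3 on tool access
print(ledger.balances["alice"])           # 4 tokens left
```

In a real deployment this bookkeeping would of course live on-chain rather than in an in-memory dictionary; the sketch only shows the shape of the incentive loop.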

We’re excited, thrilled and a little scared — as usual, we’re traversing uncharted territory. And we cannot do this alone. Please join us in making science transparent, open and accessible.

Our white paper with the details will be made available in early 2018. Until then we would love your ideas, thoughts and feedback — on our Telegram channel.

Join us!