Monthly drops

IMAGINE WE AGREE ON ETHICAL PROCEEDS OF AI
Anna Yelizarova
Listen to the Existential Hope podcast
Learn about this science
...

IDEA GENERATOR

Anna Yelizarova manages multiple projects within FLI and helps with operations and growth. She completed a Bachelor's in Computer Science at Stanford University and a Master's in Communication. She focused her graduate research on people's behavior in virtual simulations at the Virtual Human Interaction Lab (VHIL), where she helped program and 3D-model the virtual worlds used in the studies. She currently spearheads FLI's Worldbuilding contest.

...

ART GENERATOR

Anonymous contribution.

...

Xhope scenario

A eucatastrophe is not just something very good happening out of the blue. It's a storytelling device: everything is on the brink of collapse, and things are about to end very badly for all the characters we care deeply about. Then suddenly there's a knight in shining armor with an army who comes and saves us. I think that's what Tolkien was referring to as a eucatastrophe. Thinking about how this storytelling tool would look in relation to AI: there would be a lot of people fed up with their needs not being met by society, frictions building up, and perhaps even some agreement being breached. The eucatastrophe is then an event that addresses the needs of the people who are complaining. It would be an event where we collectively agree that enough is enough, where we break out of this paradigm and become bigger people in a scenario where nobody would expect it to happen, where you're pleasantly surprised and we put a stop to the machinery and agree on more ethical practices around the proceeds of AI.

Discover more positive scenarios