Technology is moving faster than our ability to process its impact, forcing us to question trust, motivation, and the value of our time. Few people have had a closer view of those shifts than Esther Dyson. With a background in economics from Harvard, Esther built a career as a journalist, author, commentator, investor, and philanthropist, with a unique ability to spot patterns across industries and challenge assumptions before they become mainstream. She is the executive founder of Wellville, a ten-year nonprofit project dedicated to improving equitable well-being in communities across the United States. Beyond her nonprofit work, Esther has been an active angel investor in healthcare, open government, digital technology, biotechnology, and even outer space. She's currently focusing on health and technology startups, especially the ones that actually care about human connection instead of just making everything faster and more efficient.

When we chatted, Esther made a really compelling point about AI. She thinks we're asking the wrong question when we debate whether artificial intelligence is good or bad. What really matters, she argues, is how we choose to interact with it. We dove into some tough ethical questions about how quickly we're adopting these technologies, the concept she calls "information diabetes," and why being upfront about who's funding what and why is absolutely crucial if we want to trust anything anymore.

Show Notes:

[01:44] Esther describes her career path from journalism to independent investing and healthcare projects.
[02:52] She explains why Wellville had a set end date and connects it to her upcoming book on time and mortality.
[04:08] Esther gives her perspective on AI, tracing its evolution from expert systems to neural networks and LLMs.
[06:17] She stresses the importance of asking who benefits from AI and being aware of hidden motives.
[12:44] The conversation turns to ethical challenges, biased research, and the idea of "information diabetes."
[15:37] Esther reflects on how wealth and influence can make it difficult to get honest feedback.
[18:09] She warns that AI speeds everything up, making it easier to do both good and harm.
[20:14] Discussion shifts to the value of work, relationships, and finding meaning beyond efficiency.
[25:45] Esther emphasizes negotiation, balance, and how ads and AI should benefit everyone involved.
[27:28] She highlights areas where AI could be most beneficial, such as healthcare, education, and reducing paperwork.
[29:26] Esther argues that AI companies using public data should help fund essential workers and services.
[31:08] She voices skepticism of universal basic income and stresses the need for human support and connection.
[34:55] Esther says AI is far from sentience and accountability lies with the humans controlling it.
[36:46] She explains why AI wouldn't want to kill humans but might rely on them for energy and resources.
[37:33] The discussion turns to addiction, instant gratification, and the importance of valuing time wisely.
[41:02] Esther compares GDP to body weight and calls for looking deeper at its components and meaning.
[42:19] She explains why she values learning from failures as much as from successful investments.