AI Simulation Gives People a Glimpse of Their Potential Future Self

In a preliminary user study, the researchers found that after interacting with Future You for about half an hour, people reported decreased anxiety and felt a stronger sense of connection with their future selves.

“We don’t have an actual time machine yet, but AI can be a sort of virtual time machine. We can use this simulation to help people think more about the consequences of the choices they are making today,” says Pat Pataranutaporn, a recent Media Lab doctoral graduate who is developing a program to advance human-AI interaction research at MIT, and co-lead author of a paper on Future You.

Pataranutaporn is joined on the paper by co-lead authors Kavin Winson, a researcher at KASIKORN Labs; and Peggy Yin, a Harvard University undergraduate; as well as Auttasak Lapapirojn and Pichayoot Ouppaphan of KASIKORN Labs; and senior authors Monchai Lertsutthiwong, head of AI research at the KASIKORN Business-Technology Group; Pattie Maes, the Germeshausen Professor of Media, Arts, and Sciences and head of the Fluid Interfaces group at MIT; and Hal Hershfield, professor of marketing, behavioral decision making, and psychology at the University of California at Los Angeles. The research will be presented at the IEEE Conference on Frontiers in Education.

A realistic simulation

Studies about conceptualizing one’s future self date back to at least the 1960s. One early approach aimed at strengthening future self-continuity had people write letters to their future selves. More recently, researchers used virtual reality goggles to help people visualize future versions of themselves.

But none of these approaches were truly interactive, limiting the impact they could have on a user.

With the advent of generative AI and large language models like ChatGPT, the researchers saw an opportunity to build a simulated future self that could discuss someone’s actual goals and aspirations during a normal conversation.

“The system makes the simulation very realistic. Future You is much more detailed than what a person could come up with by just imagining their future selves,” says Maes.

Users begin by answering a series of questions about their current lives, things that are important to them, and their goals for the future.

The AI system uses this information to create what the researchers call “future self memories,” which provide a backstory the model draws from when interacting with the user.
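To make the idea concrete, here is a minimal, hypothetical sketch of how questionnaire answers could be condensed into such a backstory and folded into a persona prompt for a language model. The function name, prompt wording, and data structure are illustrative assumptions, not the researchers' actual implementation.

```python
# Hypothetical sketch: turning questionnaire answers into "future self
# memories" that seed a chatbot persona. Names and prompt text are
# assumptions for illustration, not the Future You system's real code.

def build_future_self_prompt(answers: dict, future_age: int = 60) -> str:
    """Fold questionnaire answers into a system prompt that casts the
    model as the user's future self, grounded in synthetic memories."""
    memories = "\n".join(
        f"- About {topic}, my younger self said: {answer}"
        for topic, answer in answers.items()
    )
    return (
        f"You are the user's future self at age {future_age}. "
        "Speak in the first person and use phrases like 'when I was your age'. "
        "Ground every answer in these memories of your past:\n" + memories
    )

# Example questionnaire answers (invented for illustration)
answers = {
    "current goals": "finish a nursing degree and work in pediatric care",
    "biggest worry": "balancing school with supporting my family",
}
prompt = build_future_self_prompt(answers)
```

A prompt built this way would then be supplied as the system message of a chat-based language model, so every reply stays anchored to the user's own answers rather than a generic persona.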

For example, the chatbot might talk about the highlights of someone’s future career or answer questions about how the user overcame a particular challenge. This is possible because ChatGPT has been trained on extensive data involving people talking about their lives, careers, and good and bad experiences.

The user engages with the tool in two ways: through introspection, when they consider their life and goals as they construct their future selves, and retrospection, when they contemplate whether the simulation reflects who they see themselves becoming, says Yin.

“You can imagine Future You as a story search space. You have a chance to hear how some of your experiences, which may still be emotionally charged for you now, could be metabolized over the course of time,” she says.

To help people visualize their future selves, the system generates an age-progressed photo of the user. The chatbot is also designed to give vivid answers using phrases like “when I was your age,” so the simulation feels more like an actual future version of the person.

The ability to hear from an older version of oneself, rather than a generic AI, can have a stronger positive impact on a user contemplating an uncertain future, Hershfield says.

“The interactive, vivid components of the platform give the user an anchoring point and take something that could lead to anxious rumination and make it more concrete and productive,” he adds.

But that realism could backfire if the simulation moves in a negative direction. To prevent this, they ensure Future You cautions users that it shows only one potential version of their future self, and that they have the agency to change their lives. Providing alternate answers to the questionnaire yields a different conversation.

“This is not a prophecy, but rather a possibility,” Pataranutaporn says.

Aiding self-development

To evaluate Future You, they conducted a user study with 344 participants. Some users interacted with the system for 10-30 minutes, while others either interacted with a generic chatbot or only filled out surveys.

Participants who used Future You were able to build a closer relationship with their ideal future selves, based on a statistical analysis of their responses. These users also reported less anxiety about the future after their interactions. In addition, Future You users said the conversation felt genuine and that their values and beliefs seemed consistent in their simulated future identities.

“This work forges a new path by taking a well-established psychological technique to visualize times to come – an avatar of the future self – with cutting-edge AI. This is exactly the type of work academics should be focusing on as technology to build virtual self models merges with large language models,” says Jeremy Bailenson, the Thomas More Storke Professor of Communication at Stanford University, who was not involved with this research.

Building on the results of this initial user study, the researchers continue to refine the ways they establish context and prime users so they have conversations that help build a stronger sense of future self-continuity.

“We want to guide the user to talk about certain topics, rather than asking their future selves who the next president will be,” Pataranutaporn says.

They are also adding safeguards to prevent people from misusing the system. For instance, one could imagine a company creating a “future you” of a potential customer who achieves some great outcome in life because they purchased a particular product.

Moving forward, the researchers want to study specific applications of Future You, perhaps by enabling people to explore different careers or visualize how their everyday choices could impact climate change.

They are also gathering data from the Future You pilot to better understand how people use the system.

“We don’t want people to become dependent on this tool. Rather, we hope it is a meaningful experience that helps them see themselves and the world differently, and supports self-development,” Maes says.
