Is AI’s next big leap understanding emotion? $50M for Hume says yes
Hume AI's EVI may have just set a new standard for strikingly human-like interactivity, intonation, and speech quality.
The first study included “16,000 people from the United States, China, India, South Africa, and Venezuela” and had a subset of them listen to and record “vocal bursts,” or non-word sounds like chuckles and “uh huhs,” and assign emotions to them for the researchers.

I also recognize that this type of technology could be put to darker, more sinister, and potentially damaging uses: weaponized by criminals, government agencies, hackers, militaries, or paramilitaries for purposes such as interrogation, manipulation, fraud, surveillance, identity theft, and other adversarial actions.

In other words, increasing or decreasing the occurrence of emotional behaviors such as laughter or anger should be an active choice by developers, informed by user well-being metrics, not a lever introduced to, or discovered by, the algorithm as a means of serving a third-party objective.