Adobe, the company that gave people the power to manipulate pictures (among many other creative things), has taken it to the next level with an audio 'Photoshop'. Called Project #VoCo (Voice Conversion), it can essentially make anyone say anything, so long as you have a clear audio recording of that person. You're now able to literally put words into people's mouths. Quite an amazing (and scary) innovation.
Here’s the product introduction at the Adobe Max 2016 conference:
We have already revolutionised photo editing. Now it's time for us to do the audio stuff. - Adobe's Zeyu Jin
The video starts off slow to introduce the concept to the non-savvy audience (waveforms can be intimidating to some), along with some minor editing you can already do with Adobe Audition. The real voice-manipulation fun starts past the 3:00 mark! If this pushes through as a commercial product, its impact on the multimedia industry worldwide would be huge.
Jin noted that about 20 minutes of a person’s speech is needed for the process to work. Dr. Eddy Borges Rey, a lecturer in media and technology at the University of Stirling, was horrified by this new piece of technology:
It seems that Adobe's programmers were swept along with the excitement of creating something as innovative as a voice manipulator, and ignored the ethical dilemmas brought up by its potential misuse. Inadvertently, in its quest to create software to manipulate digital media, Adobe has [already] drastically changed the way we engage with evidential material such as photographs. This makes it hard for lawyers, journalists, and other professionals who use digital media as evidence. In the same way that Adobe's Photoshop has faced legal backlash after the continued misuse of the application by advertisers, VoCo, if released commercially, will follow its predecessor with similar consequences.
Indeed, if this software is commercialized, banks and businesses that use voiceprint checks as a security measure would be at risk. But Dr. Steven Murdoch, a cybersecurity researcher from University College London, notes that the security industry has anticipated this kind of software for some time now:
The technology is new but its underlying principles have been understood for some time. Biometric companies say their products would not be tricked by this, because the things they are looking for are not the same things that humans look for when identifying people. But the only way to find out is to test them, and it will be some time before we know the answer.
Such is the burden of innovation: as we push technology to the next level, there will always be people who use it for good and people who use it for bad. The good ones will just have to keep innovating to offset the negative repercussions caused by the bad ones.