“Emojis are already ubiquitous in digital communication platforms, including text-based messaging, email and social media. They require fewer ‘clicks’ to express intent, add context, or set tone. A single emoji can dramatically alter or enhance the interpretation of a message. We think they can be leveraged as a supplemental expressive proxy for people who may not have access to speech or the muscles that drive facial expression,” Paradiso says. “Our PALS collaborators have been some of the funniest, most thoughtful and creative people I’ve known, but their expressive range can be limited by the constraints of disability as well as today’s speech devices and their underlying technologies. We know that people want to express a lot more with their AAC devices than basic transactional communication. We wanted to create something that would help people stay more actively engaged in conversations, be visible in low light and from a distance, and provide another avenue for unique expression, playfulness and connection.”

The team looked at different types of secondary displays, but they kept coming back to LED displays, for several reasons: they’re low-cost, they work reasonably well and “they have kind of a cool factor.” The team’s collaborators in PALS also made it clear they didn’t want to use anything that could have unintended negative social consequences for the users.

“Watching somebody living with ALS and being able to empower them to do something that they previously had given up all hope of being able to do is enough to inspire you to want to do more,” says Dwayne Lamb, a developer specializing in user experience and user interface design who joined the Enable Group in early 2017. “Most of the time when you’re trying to communicate with somebody who can only use their eyes to communicate, they’re using their eyes to type into a keyboard on a device right in front of them, and although it’s not socially good etiquette, you commonly find yourself kind of looking over their shoulder to try and see what they’re typing.”

Expressive Pixels evolved in part from wanting to solve that problem.

[Image: emoji in the Expressive Pixels app]

And with Expressive Pixels, it’s possible to create animations on displays of many sizes, up to 64 x 64 pixels, says Christopher O’Dowd, who helped fill the hardware gaps on the project. He points out that LED displays are ubiquitous: at Maker Faires, on houses during the holidays, and even on fabric (face masks, caps and backpacks, for instance) or banners.
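To make the display-size idea concrete, here is a minimal sketch of how an LED animation might be modeled as a list of frames, each a width-by-height grid of RGB tuples. The class and method names, and the validation logic, are illustrative assumptions; they are not the Expressive Pixels file format or codebase.

```python
# Hypothetical sketch: an animation as frames of RGB grids, capped at 64 x 64.
from dataclasses import dataclass, field

MAX_SIDE = 64  # Expressive Pixels supports displays up to 64 x 64 pixels


@dataclass
class Animation:
    width: int
    height: int
    frames: list = field(default_factory=list)

    def __post_init__(self):
        if not (1 <= self.width <= MAX_SIDE and 1 <= self.height <= MAX_SIDE):
            raise ValueError("display size must be between 1x1 and 64x64")

    def add_frame(self, rgb_fill=(0, 0, 0)):
        # Each frame is a row-major grid of (r, g, b) tuples.
        frame = [[rgb_fill] * self.width for _ in range(self.height)]
        self.frames.append(frame)
        return frame


anim = Animation(16, 16)
anim.add_frame((255, 0, 0))  # a solid red frame
```

A real renderer would then push each frame to the LED matrix at a fixed frame rate; this sketch only covers the in-memory representation.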

One early way the team incorporated LED displays was through hands-free music, an award-winning SXSW project.

There, they used a custom MIDI-enabled, music-synced LED array as a supplemental visualization for an eye-controlled physical drum rig designed for one of their collaborators, a Seattle-area musician living with ALS. The project, which won the 2018 SXSW Interactive Innovation Award: Music and Audio Innovation, features a suite of novel eye-controlled applications for music performance, collaboration and composition.

“How does someone without access to speech or movement compose or perform music, command a stage, or connect with a live audience? What about collaborating with other musicians in rehearsed or improvisational scenarios? How can we lower the barriers to making school music programs more inclusive without ‘othering’ or minimizing the students with disabilities? These were some of our foundational questions,” Paradiso recalled. “We wanted to adapt our technology and designs to align with a person’s creative goals and real-life scenarios, rather than the other way around.”

Lamb came up with the idea of adding Musical Instrument Digital Interface (MIDI) capability so that different signals could be sent to different instruments – an approach that would later carry over to Expressive Pixels.
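The routing idea Lamb describes maps naturally onto MIDI’s channel-based messages: each instrument listens on its own channel, and a controller addresses them by setting the channel bits in the status byte. The sketch below builds raw Note On/Off messages from the MIDI 1.0 spec; the helper names, channel assignments, and the notion of an LED array listening for a “cue” note are illustrative assumptions, not the team’s actual implementation.

```python
# Sketch: raw 3-byte MIDI channel messages (status | channel, data1, data2).


def note_on(channel: int, note: int, velocity: int = 100) -> bytes:
    """Build a MIDI Note On message: status 0x90 OR'd with the channel."""
    assert 0 <= channel <= 15 and 0 <= note <= 127 and 0 <= velocity <= 127
    return bytes([0x90 | channel, note, velocity])


def note_off(channel: int, note: int) -> bytes:
    """Build a MIDI Note Off message: status 0x80 OR'd with the channel."""
    assert 0 <= channel <= 15 and 0 <= note <= 127
    return bytes([0x80 | channel, note, 0])


# Different signals go to different instruments by channel (hypothetical mapping):
DRUM_CHANNEL, LED_CHANNEL = 9, 0   # channel 10 (index 9) is the General MIDI drum channel
kick = note_on(DRUM_CHANNEL, 36)   # GM note 36 = bass drum
led_cue = note_on(LED_CHANNEL, 60) # a note the LED array could treat as a visual cue
```

In practice these bytes would be written to a MIDI output port (for example via a library such as `python-rtmidi`); the point here is just that one outgoing stream can address many listeners by channel.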


