As our interactions with machines evolve, exploring and creating new boundaries around immersive interfaces becomes critical. Media Under Dystopia: WISPer edition is a hands-on lab exhibition exploring how the internet and XR can become tools to democratize art creation, access, co-creation, and critical media expansion. With this exhibition, the MUD Foundation is prototyping the future of its current program iterations with local, national, and international communities. The name "WISPer" combines the acronym "WISP" (Wireless Internet Service Provider) with the word "whisper." The exhibition launches our WISPer program, creating a WISP network around the MUD Foundation venue (and beyond) through which local, national, and international communities can access the MUD Foundation exhibition program and community partners' initiatives. At the same time, audiences gain access to art and art-creation tools through our metaverse XR platform: the MUD Verse.
A garden of sculptural machines and furniture for viewing a video game, inspired by fusions of futuristic industrial design, pre-Columbian sculpture, architectural renderings, and video-game-conference aesthetics. Game Viewing Gardens brings viewers into a space that is part furniture showroom, part mythological location, where beings, technology, and spaces are interrelated within the neo-primordial gaming universe of Castañeda’s “Levels & Bosses: Part One”. Sculpturally reinterpreting “Let’s Play” scenarios, in which a person showcases a walk-through of a game online or on stage, this work reimagines those scenarios as virtual performances. Game Viewing Gardens is available on the MUD Verse.
Lans King has an NFC microchip implanted in his left hand. His "artist-self" is registered on the blockchain and linked to all of his artworks. Using a variety of wearable devices, Lans collects physical, emotional, and cerebral biodata; his digital artworks are often derived or generated from that data. This virtual sculpture was generated by the artist using a brain-computer interface device that captures brain activity data. The data is fed into a generative algorithm and parametric software, with each channel mapped to a specific parameter. The software transforms an initial geometry (such as a sphere) into a variety of new forms that are unique and rare. Through this process, thoughts become forms. This specific work was generated by Lans's brainwave data as he focused his thoughts on the word "tomorrow." The work is a prototype for a series of performance/generative artworks called The Cyborg Manifesto. In 2024, Lans King will tour this series of performances, generating virtual sculptures in real time while wearing his brain-computer interface device. For each performance, he will spend 24 hours in a glass pod, connected to a computer running a generative algorithm. The resulting digital sculptures will then be output as physical forms.
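As a rough illustration (not the artist's actual pipeline), the channel-to-parameter mapping described above can be sketched in Python. The channel names, weights, and perturbation formula here are invented for the example; the idea is only that each data channel drives one parameter deforming a base sphere:

```python
import math

# Hypothetical normalized EEG channel readings (0-1); in the actual work
# these would come from the brain-computer interface device.
channels = {"alpha": 0.62, "beta": 0.35, "theta": 0.48, "gamma": 0.21}

def deform_sphere(channels, n_lat=8, n_lon=16, base_radius=1.0):
    """Map each channel to one sinusoidal perturbation of a sphere's
    radius, so a given set of readings yields a unique form."""
    params = list(channels.values())
    points = []
    for i in range(n_lat + 1):
        theta = math.pi * i / n_lat          # polar angle
        for j in range(n_lon):
            phi = 2 * math.pi * j / n_lon    # azimuthal angle
            r = base_radius
            for k, p in enumerate(params):
                # channel k perturbs the radius at its own frequency
                r += 0.2 * p * math.sin((k + 1) * theta) * math.cos((k + 1) * phi)
            points.append((r * math.sin(theta) * math.cos(phi),
                           r * math.sin(theta) * math.sin(phi),
                           r * math.cos(theta)))
    return points

mesh = deform_sphere(channels)  # (n_lat+1) * n_lon vertices
```

Different readings produce different radii at every vertex, which is the sense in which "thoughts become forms" in the text above.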
The new edition of Media Under Dystopia 3.0: WASD displays artworks created by visual artists using the internet as a creative medium. It presents tech-centered pieces that deconstruct our digital environment and culture while exploring the open metaverse as a place for creation within artistic practices.
Microverse delves into the vibrational essence of the natural world. The instrument sounds without being touched, and the waves it emits are seen in space. Visitors can navigate this otherwise imperceptible reality thanks to a virtual reality experience.
In this artwork, the intersection of digital and cultural histories is explored through research into, and the reimagining of, “card stunts” used in various historical contexts, where each individual becomes a single pixel, a surrogate for the collective inertia.
New Extractivism explores a new form of extractivism at work in the stack behind contemporary technological systems, creating a blueprint of a machine-like superstructure: a super allegory.
Fusing religious iconography from the artist’s upbringing with contemporary queer imagery, the work conjures an apparition that will protect and unite all who experience it.
Interfacing Actuality takes the notion of interfaces as a point of entry for enacting a shift currently taking place in our reality, demonstrated by the phenomenon of the metaverse. This shift I have termed the prototyping of actuality. Interfacing Actuality refers to the metaverse as the technical system by which the actual becomes an interface.
A virtual twin of XRhub, the MUD Foundation XR Lab, and gallery space based in Little Haiti, Miami, Florida, US.
In a virtual environment, a set of drone-like beings analyze and manipulate an organic mass-like entity. Testing modes of interaction through varying intensities of light and touch, the drones train to interact with a section of a sentient world whose material properties are uncertain, shifting between liquid, solid, and gas states. Drawings relating to the process and environment accompany a wallpaper that serves as a simulated UI for a future or alternate-world technology in which landscapes, living beings, and machines are interchangeable and interconnected. The images suggest an inventory or interface around or inside the entities. A micro-scene from the expanded video-game experience Levels & Bosses.
We Are Being Watched is a computer-generated work comprising four screens and a live-feed projection. Each screen displays one word of the title, moving vertically on the screen, glitching like damaged VHS tape or old TV reception. The projection shows a glitched live feed of the audience onto the glass doors of the exhibition space, enabling people outside to see what is happening inside. The work interrogates the classic surveillance slogan You Are Being Watched by reframing the YOU as WE. This semantic alteration places the awareness in the people who are being watched while diminishing the power of the invisible watcher, the Orwellian Big Brother, the all-seeing eye of a totalitarian police state. In the 'we' there is a communal experience that empowers the surveilled targets, providing a sense of inclusion rather than fear and paranoia.
A music composition unfolds in a virtual reality space where the spectator-user can move around, altering the music depending on where they are within the space. The work also features video, which forms part of the musical 'narrative'.
Pilgram: Naked Link 3.0 is the continuing process of a conceptual work that explores the boundaries between data visualization and art made from data collection, creating a link between scientific InfoVis and data sublimation. Pilgram 1.0, the first iteration, was made in 2015 and opened a hotspot in Havana. The 2.0 iteration showed the invisible structures that tie the U.S. (Miami) and Cuba together, monitoring and analyzing the packet data traffic. The 3.0 Naked Link iteration will examine how (or whether) the communication infrastructure has changed since the 2.0 research. For this iteration, we will display in real time how the flow of communication between the two cities, and the two countries, takes place today. Pilgram: Naked Link 3.0 will visualize, on real-time XR (extended reality) interfaces, the links and infrastructure between the two countries that would otherwise be invisible to the naked eye.
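The kind of monitoring described above can be sketched at its simplest: aggregate per-direction traffic volumes between two endpoints, the raw material a real-time XR interface could then map to visible links. This is an illustrative sketch only; the endpoint names and packet records are invented, and the actual work would capture live traffic:

```python
from collections import Counter

# Hypothetical packet records (src, dst, bytes); in the actual piece
# these would come from live capture between the two cities.
packets = [
    ("miami-gw", "havana-gw", 1200),
    ("havana-gw", "miami-gw", 340),
    ("miami-gw", "havana-gw", 880),
]

def flow_volumes(packets):
    """Sum byte counts per (source, destination) direction, yielding
    one weight per link for the visualization layer to render."""
    volumes = Counter()
    for src, dst, size in packets:
        volumes[(src, dst)] += size
    return dict(volumes)

volumes = flow_volumes(packets)
# e.g. volumes[("miami-gw", "havana-gw")] → 2080
```

A rendering layer would poll such aggregates on an interval and scale each drawn link by its current volume.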
Seeming to float above the shallow seagrass beds of Miami’s Biscayne Bay, Stiltsville has a colorful history dating back to the 1930s, when “Crawfish Eddie Walker” built the first shack on stilts above the water. Told in the style of a time-travelling journal, with puzzles solved, games played, images captured, and clues discovered, the Oculus Quest experience guides us through the houses, their history, and their underwater sea-life colonies, and introduces us to their colorful caretakers.
A 3D immersive world built from young people’s visions for a more sustainable and socially responsible future, and we need to build it! Dream City will come to life using immersive technologies (WebXR, AR, VR, MR) and is available on phones, PCs, and headsets, providing experiences that extend human perception and the interactions between the real and digital worlds.
Led by the M4 team in collaboration with the MUD Foundation, the Manchay Project is a transformative initiative for the San Francisco de Asís School in Manchay. Addressing educational gaps in literacy and math, it introduces the metaverse as an avant-garde learning tool. Over three years, the initiative will establish a Wi-Fi-equipped garden library, debut a metaverse learning platform, and ensure the best use of resources. This effort is set to establish a benchmark for educational strategies in developing areas.