Most of us think of DoorDash as a company that brings you your lunch. But the real essence of the brand is the human element: the millions of connections that happen every day between Dashers, merchants, and DoorDash users. We take it for granted that strangers, from the drivers to the restaurant staff to the person ordering their food, are able to seamlessly and almost magically connect millions of times every day with no effort or formal coordination.
To showcase this reality in the most beautiful way, we wanted to let the data tell the story and create a powerful, artful sculpture that could live as an evergreen platform in the lobby of DoorDash's new San Francisco headquarters. This sculptural LED wall is powered by a robust CMS populated with photos and videos from the brand, and it leverages DoorDash data in abstract, painterly ways, creating map-based data visualizations that infuse the office with the life and dynamism of the DoorDash platform itself. We created an easy-to-use admin tool, similar to a Spotify playlist, so that the DoorDash admin can program a schedule that cycles through various modes and color palettes to match the time of day and the desired ambience of the office space. We handled concepting, design, development, technology, fabrication, and installation.
Executive Producer: Adam Baskin
Director: GMUNK
Producer: Rebeca Diaz
Production Design: VT Pro Design
Creative: Michael Fullman
Project Manager: Paul Elsberg
Chief Technology Officer: Harry Souders
TouchDesigner Lead: Matt Wachter
TouchDesigner: Aki Yamashita
Architect: Anass Benhachmi
Senior Creative Technologist: Colin Honigman
Creative Technologist: Dom Ricci
Production Director: Hayk Khanjian
Production Manager: Nicolas Yernazian
Content Designer: Toros Kose
Case Study Director: Andrew Curtis
Case Study Director: Aaron Marcellino
Case Study Photography: Bradley G Munkowitz
Project Overview
As a creative technology studio at the intersection of user experience and advertising, we were engaged by Twitch to develop user experiences that would help the platform evolve beyond gaming into a home for premium licensed content, and prove the value of substantial media buys. One of the concepts we delivered centered on live professional sports broadcasts, gamifying the viewer experience to create something that could only be possible on Twitch. When Amazon gained the rights to broadcast NFL Thursday Night Football, Twitch was able to leverage the live broadcast and this idea became a reality.
During the 2018 football season, the NFL began live-streaming Thursday Night Football on Twitch and wanted a custom Twitch extension to complement the broadcast, allowing viewers to compete in a live, interactive game with each other as they watched the NFL games.
Team Stats
Similar to fantasy football, each featured matchup provided an opportunity for users to compare stats in the head-to-head and use that information to score points on various outcomes in the game.
Users could access detailed team stats including average passing/rushing yards per game, average yards against, sacks per game, time of possession, and more.
Game Predictions
Users could make predictions at various points through the game, with different values to incentivize frequent interactions within a game as well as longer total engagement for viewers over the course of the season.
Prediction questions were divided into three categories (game predictions, quarterly over/unders and bonus drives) that were randomly generated, allowing for over 100,000 possible prediction combinations in any given game.
Leaderboards & Rewards
The points that users earned from correct predictions were tallied and entered into a community leaderboard where users could see how they stood up against the competition over the course of the season.
Depending on how they scored, users were placed into different tiers on the leaderboard. Each tier had corresponding rewards, such as borders, badges and other bragging rights at the end of the season.
Maharishi By You is so much more than a shoe. It is a meeting of legends — Nike and Maharishi bringing together their expertise to create not only a sneaker, but a philosophy. It is a coalition of nature and technology — using cutting-edge design to both celebrate the natural world and produce a product that honors it. And it requires an elevation of execution — infusing innovation to change how image campaigns are captured by using military camera technologies to turn an image into an immersion.
The team assembled for this project included famed director GMUNK, design duo Toros Kose & Peiter Hergert, Dr. Joseph Picard as the DP, and SoundsRed making the magic with audio. We utilized special military-grade camera technologies to present surrealist, thematic environments for the shoes to exist in. What is so inspiring about these camera techniques is how they register the invisible, showing us what is normally unseen: underneath any image lies an alternate reality teeming with vivid existence.
This is true for the vibrant colors we find in foliage within the Infrared Light Spectrum, for whatever lies beneath total darkness in Night Vision, and for the thermal energy coursing underneath the surface, and its inverted view, in Thermal Imaging. We found a way to use these military imaging technologies to create innovative art, much as the Maharishi By You takes military fabrics and patterns and turns them into enduring fashion.
These shoes are making a bold statement about our world. Where we are being fed a binary story about nature and technology, these shoes pair Nike's Air Max 720 with the Maharishi Leopard Camo Print to create an artifact proving that to really adapt is to marry the two. If camouflage is an "abstract rendition of nature," then our leopard print is a co-opting of the co-opting: nature redesigned in graphic form as camo.
Camo is a symbol of the military, and yet with the Nike Maharishis we are reconnecting to our animal selves. Merging leopard print and camouflage for a new way to symbolically connect back to where camo began: nature. And Maharishi’s military inspiration also gives us a means to serve nature. The hardiness of military clothing and handcrafted utilitarianism is an antidote to the disintegrating ephemerality of fast fashion. Shoes that say so much need a very particular kind of visual campaign that can encompass all of that depth.
Because the execution brought together so many elements, from the camera technologies developed by the military, to the art direction that immersed the shoes in nature, to the hand retouching and intensely detailed motion design in post-production, the team was able to not only capture but embody the ethos of Maharishi By You.
Adaptation is not just blending in perfectly; it is immersing yourself in the elements so completely that eventually the elements change you and you become one with nature. Total immersion, total oneness, was the goal here, within every immaculate detail.
For the print campaign, the team created multi-layered images that spoke to the multifaceted meaning of the Maharishi By You: the marriage of nature and technology, the military seen through the lens of pacifism. It's a military-inspired world, taking the technology honed for war and transforming it for art and nature. So the images needed to feel elevated and complex, never saying just one thing.
To achieve this, the team combined photographic plates for ultimate control, applying rich texture to create the ideal synthesis and giving each image soul through film grain and immense detailing, translating that spirit into grit and a timeless analog feeling.
To achieve this campaign, the team built an intricate set in-studio where they could completely control the elements. As a result, they played in a lush jungle that embraced the military-and-nature vibe to the utmost. The key word here was texture. The foliage, the grit, the boulders, the sand, and the paint: we're talking maximum texture, layering the elements to create real depth in the frame. Objects sat in the foreground and background, with rocks and minerals creating a cool ground plane to complement the verdant jungle textures. Nothing was placed perfectly evenly; the frame was rich with layers that were further augmented in post, while negative space was used judiciously in tandem with the disparate textures for a harmonious complexity that immerses the eye.
Role: Executive Producer
Director: GMUNK
Tool of NA Line Producer: Taylor Bro
Assist. Production Supervisor: Zane Roessell
Assist. Director: Derek Jaeschke
Lead Designer & Animator: Toros Kose
UI Designer & Editor: Toros Kose, Peiter Hergert
Composer: Keith Ruggiero
Director of Photography: Joe Picard
1st AC: Nicholas Kramer
Digital Image Tech: Kyle Hoekstra
To showcase the future of 5G and its impact on the NFL, Tool of North America partnered with Momentum Worldwide and Verizon to create a unique interactive activation in Miami's Bayfront Park for Super Bowl LIV. The experience mixed live-action dome content with a Unity app that synced across 50 phones at once. The app held multiple features, including augmented reality, volumetric replay, and a toggleable 4K multi-camera array, all feeding the narrative that 5G will bring you closer to the game than ever before.
The Shoot
We secured the Miami Dolphins' Hard Rock Stadium as our backdrop for this content experience. Our goal was to take guests on a ride that descended from the heavens and placed them right in the middle of the action. Shooting on the RED Monstro with the Entaniya HAL 250, we were able to shoot in 8K with a field of view above 220 degrees. This setup allowed us to get inches away from the players, making them appear as giants, while, unlike a 360 camera, eliminating all stitching issues. Adding to the sensory experience, we wanted to lean on one of the strengths of dome projection: its ability to simulate motion. To do so, we flew a drone in and out of the stadium and mounted our camera on a Technocrane, allowing for smooth yet dynamic moves during gameplay.
VFX and Design Pipeline
Going into the project, we knew we needed to develop a flexible pipeline that allowed us to adjust the tilt of our final output and gave our team a straightforward compositing and design canvas to work from. To do so, we developed a solution that converted the fisheye image out of the camera to a 2:1 lat-long and then back to our final 4K-by-4K output. Our setup gave us full control over the degree of tilt, as well as the ability to pull back and zoom into each shot within our extended 220-degree FOV.
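To make the fisheye-to-lat-long round trip concrete, here is a minimal sketch of the per-pixel mapping, assuming an ideal equidistant fisheye model with a 220-degree FOV. The function name, the tilt parameter, and the equidistant assumption are illustrative only; the actual lens has its own distortion profile and the production pipeline used compositing tools rather than hand-rolled code.

```typescript
// Sketch: map a pixel in a 2:1 lat-long (equirectangular) image back to a
// sample position in the source fisheye frame, with an optional tilt.
interface Vec2 { x: number; y: number }

const FOV_DEG = 220;                                  // assumed lens field of view
const HALF_FOV = (FOV_DEG / 2) * Math.PI / 180;

function latLongToFisheye(
  u: number, v: number,                               // pixel coords in the lat-long image
  llWidth: number, llHeight: number,
  fishSize: number,                                   // fisheye image is square: fishSize x fishSize
  tiltDeg = 0
): Vec2 | null {
  const lon = (u / llWidth) * 2 * Math.PI - Math.PI;  // -PI..PI
  const lat = Math.PI / 2 - (v / llHeight) * Math.PI; // PI/2..-PI/2

  // Direction vector on the unit sphere (z = lens axis).
  const x = Math.cos(lat) * Math.sin(lon);
  let y = Math.sin(lat);
  let z = Math.cos(lat) * Math.cos(lon);

  // Tilt the dome: rotate the ray about the x axis.
  const t = tiltDeg * Math.PI / 180;
  const yT = y * Math.cos(t) - z * Math.sin(t);
  const zT = y * Math.sin(t) + z * Math.cos(t);
  y = yT; z = zT;

  // Angle from the optical axis; equidistant fisheye maps it linearly to radius.
  const theta = Math.acos(Math.max(-1, Math.min(1, z)));
  if (theta > HALF_FOV) return null;                  // outside the lens FOV

  const r = (theta / HALF_FOV) * (fishSize / 2);
  const az = Math.atan2(y, x);
  return {
    x: fishSize / 2 + r * Math.cos(az),
    y: fishSize / 2 + r * Math.sin(az),
  };
}
```

Running the same mapping in reverse produces the final square dome-master output, which is where the tilt and zoom controls described above are applied.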
With this pipeline in place, the team set out to tackle a long list of tasks, the first of which was bringing a crowd to an empty stadium. While we would have loved to do a full 3D crowd replacement, the timeline and budget didn't allow for it. As a pivot, we applied a stylized look, tracked lights, and particle effects to the stands, referencing the look of cell phones around the stadium. In addition to the crowds and clean-up, we implemented a series of scans that wiped over the scene, each indicating an interactive moment on the phone. We also added graphic treatments on top of our intro and outro drone scenes, flying through a countdown as you enter the stadium and flying above the stratosphere as you exit.
Mobile Experience
In addition to the projected dome content, there was an additional layer of interactivity to the experience that took place on fifty synced mobile devices in each viewer's seat. These 5G phones allowed users to engage with the dome content in multiple ways, including augmented reality wayfinding and player stats, multi-camera viewing, and 3D volumetric replays. The key challenges we aimed to tackle here from a user interaction perspective were:
When should people be using the mobile devices during the experience?
How does it relate to or inform the content on the dome at the time?
How do we make the relationship between the dome and the phones feel cohesive and natural?
Since Verizon wanted to flex multiple features of 5G during the experience, each requiring different device functionality, we worked with the content team to stage the phone experience, creating designated interaction windows that occurred periodically throughout the entire content piece.
During testing, we figured out how long users would need to comfortably engage with these interaction windows, accounting for both on-boarding for new, unique interactions and exploration in each stage.
Keeping this interactive relationship clear and intuitive required a multifaceted approach including:
Introductory cues and hotspots displayed on the dome projection
Overlaying UI elements and supporting instructions on the phones in sync with the dome content
Instructional language in the voiceover for the overall experience to drive it home
Tech Approach
As an integral part of the experience, we synced fifty mobile devices to our dome content while seamlessly integrating multiple real-world 5G capabilities on each device with total user control. To sync our devices, we took in an LTC-encoded audio feed from the AV system and decoded the signal with a custom application in openFrameworks. The signal was passed through a custom-built Node.js server that communicated with each device over WebSockets, achieving an extremely precise and stable end output.
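A minimal sketch of that relay is below, assuming the openFrameworks decoder forwards the timecode to the Node.js process over UDP and each phone holds an open WebSocket connection. The ports, message shape, and UDP hand-off are assumptions for illustration, not the production protocol.

```typescript
// Sketch: fan a decoded LTC timecode out to every connected device over WebSockets.
import dgram from "dgram";
import { WebSocketServer, WebSocket } from "ws";

const wss = new WebSocketServer({ port: 8080 });   // phones connect here
const udp = dgram.createSocket("udp4");            // decoded LTC arrives here

udp.on("message", (msg) => {
  // e.g. "01:02:03:15" (hours:minutes:seconds:frames) from the LTC decoder
  const timecode = msg.toString().trim();
  const payload = JSON.stringify({ type: "sync", timecode, sentAt: Date.now() });

  // Broadcast the same clock to every device so the app on each phone
  // can hold its local timeline against the dome playback.
  for (const client of wss.clients) {
    if (client.readyState === WebSocket.OPEN) client.send(payload);
  }
});

udp.bind(9999);
```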
We developed an impressive approach to AR given an extremely challenging environment: low light, moving images, and a warped surface are everything you don't want for AR. To compensate, we used multiple inputs to orient each device: compass data and rotation data from ARCore (3DOF), as well as a CAD model of our dome, matching each real-world seat position to its corresponding location in our virtual environment.
Prototyping
We needed the AR tracking to be as precise as possible because of the challenging environment, so we built multiple prototypes in the early stages of the project to make sure our final approach was bulletproof and the tracking was on point.
One early consideration was using AR markers embedded in the content, but after prototyping it, we realized that the warp and deformation of the content would interfere with the perspective of the AR content, breaking the experience for the seats closer to the screen.
3DOF vs. 6DOF was the main question to test, so we prototyped three different approaches. 6DOF + gyroscope data worked when there was static content on the projection, but the motion of "pick up your phone" and the darkness of the dome caused the content to drift a lot. 3DOF + gyroscope seemed to be the best approach, but the gyroscope data was too noisy and unstable, even after filtering. We ended up going a simpler route, using only 3DOF and compass data to refine the AR tracking and give a true north to the app that we could monitor and tweak from the backend, as sketched after the list below. The final approach relied on:
3D models of the real-world space, where each seat has a corresponding virtual position.
Compass data to give the app a true north, so the phone always knows where the content is.
A control backend to monitor and calibrate phones on the go between sessions, without the need for new builds.
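Here is a conceptual sketch of how those three pieces compose into a device pose. It is written in TypeScript purely for illustration (the production app ran in Unity), and the names, seat lookup, and calibration offset are assumptions.

```typescript
// Sketch: 3DOF rotation + compass + known seat position -> AR camera pose.
interface Seat { row: number; number: number }
interface Pose { position: [number, number, number]; yawOffsetDeg: number }

// Each real-world seat has a matching position in the virtual dome model,
// exported from the CAD file. Lookup table keyed by "row-seat".
const seatPositions = new Map<string, [number, number, number]>();

function devicePose(
  seat: Seat,
  compassHeadingDeg: number,   // heading reported by the phone
  venueNorthDeg: number        // calibrated "true north" of the dome, set from the backend
): Pose {
  const position = seatPositions.get(`${seat.row}-${seat.number}`);
  if (!position) throw new Error("Unknown seat");

  // The phone only contributes rotation (3DOF). Its position is pinned to the
  // seat, and the compass aligns the device's local yaw to the dome's north,
  // so AR content anchored in dome coordinates appears in the right direction.
  const yawOffsetDeg = ((compassHeadingDeg - venueNorthDeg) % 360 + 360) % 360;
  return { position, yawOffsetDeg };
}
```

Because the yaw offset and seat table live on the control backend, the tracking could be recalibrated between sessions without shipping new builds.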
Tool is fortunate enough to have goodboybob, a coffee shop and wine bar open to the public, on site at our main office. As we all social-distanced during the spring of 2020, we wanted to satisfy a few distinct needs: we missed social gatherings with our clients, we wanted to elevate the live virtual experiences we were seeing, and we wanted to reinvent the dining experience. Most of all, we wanted to show brands what could be accomplished with talent in separate locations, simple production setups, and a live stream managed remotely.
The Upfront
We emailed our guests an invitation to the intimate event, which led them to an RSVP webform to provide all relevant info (delivery address, dietary restrictions, etc.), enabling us to deliver a care package of amazing prepared ingredients for a multi-course meal and wine pairing. These kits were then delivered to guests all over the Los Angeles area on the morning of our Virtual Dinner Event.
Pre-Production
We had two physically separated production spaces where our talent would engage the audience: the kitchen at Tool and the wine room at goodboybob. We selected cameras, lensing, audio, and simple lighting setups for each. Our sommelier, in the coffee shop, had a simple single-camera (DSLR) setup, and our chef, in the kitchen, had a more robust three-camera setup.
We tested the setup in the days prior and got the talent, our director, and our technical director comfortable with all aspects of the run of show, from live camera switching to engaging with and responding to questions from the guests who would join via video conference.
The Production
Tool has a 10-year history of producing complex, interactive live-stream productions with highly technical production requirements, so we were excited to see what could be accomplished in a time of social distancing. We chose simple camera setups and talent in different locations, and leveraged a number of simple production enhancements for our execution of the virtual dinner.
Remote Mission Control
Using live streaming software, we were able to ingest all four of our live camera feeds, our four mixed audio sources, pre-recorded edits, digital graphics, titles, a music playlist, and other assets to control the stream remotely. We created various setups in pre-production, mapping our detailed run of show to the broadcast, and then evolved that in real time, performing live camera switching and even making real-time changes to the run of show.
Multi-Camera Setup
To simulate some of the constraints of current productions in the time of quarantine, we utilized streamlined prosumer drop kits that could be easily delivered to (non-technical) talent in any location and set up via a quick video conference.
Pre-Created Assets
We created a suite of graphics and videos to enhance the live stream. In the days leading up to the stream, we captured beautiful content related to the food and wine for the dinner. This content was edited and played during the live stream to give viewers background on the quality of the wine and ingredients and to show them the preparation that went into the meal (the cheese is homemade!). We also included a graphics package with lower thirds, title cards, and end cards to introduce talent, establish ingredients for viewers, and create a seamless end-to-end experience before and after the broadcast.
Interactivity
In this instance we used Zoom as our video conferencing platform, although the platform (or even a custom web interface) is a project-by-project decision that can be made based on the desired creative. We structured our run of show to let our dinner guests see and hear each other during the upfront gathering period, then maintain the ability to interact via voice once our chef and sommelier had begun the show. It was important to us that this format was interactive and that our guests could engage with our talent in real time to ask questions and create a social dining experience.
Results
We were able to quickly pull our interactive dinner together and our guests left full, entertained and delighted by this new way for people to come together and share in a meal. This first dinner allowed us to experiment on ways to create engaging live streams while keeping people on all sides of the camera separate and safe. From here we will continue to push on what can be done while people are still isolating, creating new avenues for meaningful connection and engagement with brands, experiences and one another.
Amazon wanted to kick off the second season of their show, The Grand Tour, with a bang by broadcasting on Twitch a social recreation of a fan-favorite episode from the first season. An explosive take on the classic kids' game Battleship, the benchmark episode featured the former Top Gear stars, separated by a wall of shipping containers, dropping live explosives onto actual cars: their battleships.
Our task was to find out how to put this MythBusters-esque fun into the hands of the fans and make it even bigger. So we created an international showdown where Twitch viewers from around the world, led by favorite streamers from their respective regions, were pitted against each other in the largest game of Battleship ever!
The Challenge: How to Play
The biggest challenge of the idea was how we were going to get tens of thousands of people around the world participating in a single game being streamed from a rock quarry in Southern California.
So we developed a custom Twitch extension for the event that allowed us to divide the users on stream into two teams, each voting on which spaces they wanted to blow up by clicking an overlay on their stream.
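A hypothetical sketch of the vote tallying that backend logic implies is shown below. Twitch Extensions handle identity and messaging through their own APIs; here the team assignment and clicked grid space are assumed to arrive as a simple message, and all names are illustrative.

```typescript
// Sketch: aggregate one vote per user per round and pick each team's winning space.
type Team = "A" | "B";

const votes: Record<Team, Map<string, number>> = { A: new Map(), B: new Map() };
const voted = new Set<string>();   // one vote per user per round

function castVote(userId: string, team: Team, space: string): void {
  if (voted.has(userId)) return;               // ignore duplicate votes
  voted.add(userId);
  votes[team].set(space, (votes[team].get(space) ?? 0) + 1);
}

// At the end of a round, the winning space for a team is simply the
// most-voted grid cell, which can then be relayed to the crew on site.
function winningSpace(team: Team): string | undefined {
  let best: string | undefined;
  let bestCount = -1;
  for (const [space, count] of votes[team]) {
    if (count > bestCount) { best = space; bestCount = count; }
  }
  return best;
}
```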
Making an Authentic Twitch Experience
Blowing things up online was a great start, but we made sure to follow through in keeping this experience true to Twitch. This involved enlisting some of the most popular streamers on the platform to join in the action, as well as releasing an exclusive set of Twitch emotes for chat to spam throughout the weeks surrounding the event.
Client: Amazon
Execution Type: Livestream, Custom Twitch Integration, Social Influencers
Client: The Grammys
Execution Type: Experiential Augmented Reality
Location: New York
Client: Google
Execution Type: Voice Experience
Event: SXSW @ Google Fun House
Client: Pandora
Execution Type: WebGL & Generative Music
Event: SXSW
Client: Oculus
Execution Type: VR, Original IP
Client: Nature Valley
Execution Type: 360 Capture, Interactive Website
Locations: Great Smoky Mountains, Yellowstone, Grand Canyon, Sequoia & Kings Canyon