ADC Awards - 2021
Telly Awards - 2021
ABMN - 2021
WEBBY AWARDS - 2021
WEBBY AWARDS - 2014
CCSP - 2014
CRIARP GrandPrix - 2014
WAVE Festival - 2014
ACT Responsible Cannes Tribute - 2014
Forbes article - 2021
How to start working with Machine Learning and Artificial Intelligence.
Terra article - 2021
Why AR headsets will be the next big thing, and how Facebook is leading the race.
Band News story - 2019
Deaf mother hears her daughter's heartbeat for the first time with the help of technology.
Jornal do Comercio - 2021
Article about A.I. and deepfakes.
Portal Press - 2020
Brazilian makes virtual installations for the Burning Man VR edition.
This selection of projects is the result of several experiments, prototypes, and research. But I couldn't have made them without great teams and clients.
To promote HBO’s adaptation of Ta-Nehisi Coates’ acclaimed memoir, “Between the World and Me,” BUCK condensed the author’s pride, grief, and empowerment into a potent trailer.
Grounded in a textural backdrop with a minimal palette, the trailer uses a mixed-media approach reminiscent of collaged scrapbooks and protest posters to establish a tone of unrehearsed authenticity.
All photos used in this trailer were treated and animated by an Artificial Intelligence algorithm. My work was to create a fast, frictionless pipeline that automated the simple animations, freeing artists to focus on other important parts of the video.
Machine Learning | Neural Volumes and NeRF
Verizon Media wanted an innovative way to present their annual event, this year in a virtual form. So we created three virtual worlds in Unreal Engine for BUCK's first XR production.
On this project I led the Unreal Engine team and worked alongside producers, XR studios, designers, and 3D artists to deliver this 45-minute commercial.
Famous children's books brought into an Augmented Reality app that lets faraway parents feel closer to their kids. The app allows them to become some of their children’s favorite storybook characters as they read along to well-loved tales. Story Time’s music, animation, and immersive AR effects bring stories to life like never before. As a developer, having a piece of software that I wrote installed on hundreds of thousands of devices is definitely something I’m very proud of ❤.
I participated in the production of four books: two from the Llama Llama series and two for Otto. I also contributed to the project's pipeline, creating tools that helped my team with long tasks, such as exporting chapters or debugging scenes.
Verizon wanted to bring customers into the world of 5G and show off its benefits in a fun, playful way. We cooked up some interactive storytelling with WebAR and 8th Wall.
Deployed in Verizon stores around the US, customers got the chance to learn about 5G and how it will enhance their day to day lives by exploring a series of animated AR vignettes. We put a lot of thought into how guests would navigate the experience and how they would explore storytelling in the round — giving them the opportunity to act as both the viewer and director.
In the same year Microsoft HoloLens was launched, the Brazilian bank Agibank hired me for a special project: they wanted to explore the question 'What will internet banking look like in the future?' I worked with the bank’s innovation & research team and digital designers to test answers to this question. We designed an AR banking app featuring withdrawals, transfers, and statements.
HoloLens had just been released that year. There were no references to draw on and no case studies to follow. This was a challenging project, not only because we were creating something no one had made before, but also because we had to understand how a 'normal user' would interact with an AR interface. My role was to prototype new ideas, test new interactions, and design the interfaces and their animations.
Unity 3D / MRTK / HoloLens 1
“The future is real-time” may sound like a bumper sticker slogan, but it’s becoming a simple fact in entertainment and, now, advertising and brand marketing.
Using existing creative technology in new ways, BUCK has developed a new approach to branded content creation that uses a designer’s time more efficiently: one that focuses on the creative work and shifts the volume challenge onto smart systems instead of human designers. The early results of prototype tests like this show the promising power of this approach.
I worked on the system architecture and created the initial prototype connecting UE4 to a web application.
During the project I developed three prototypes:
• Prototype 1: a custom, web-based, high-volume data visualisation tool for products
• Prototype 2: a rapid-creation header graphics generator for use in online marketing campaigns
• Prototype 3: a product beauty shot engine for online stores
UE4 | WebGL | WebRTC | WebSockets
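The glue between the web front end and UE4 can be pictured as a small message protocol carried over WebSockets: the web app sends a JSON command, the engine listener parses it and updates the scene. A minimal sketch of that protocol in Python — the field names, target id, and parameter here are my assumptions for illustration, not the production schema:

```python
import json

def encode_command(target: str, parameter: str, value) -> str:
    """Serialize a control command from the web app into a JSON
    message that the engine-side listener can parse and apply."""
    return json.dumps({"target": target, "parameter": parameter, "value": value})

def decode_command(message: str) -> dict:
    """Parse an incoming message and validate the expected fields
    before handing it to the engine."""
    cmd = json.loads(message)
    for key in ("target", "parameter", "value"):
        if key not in cmd:
            raise ValueError(f"missing field: {key}")
    return cmd

# Example: the web UI asks the engine to recolor a product mesh
# (mesh and parameter names are hypothetical).
msg = encode_command("ProductMesh_01", "BaseColor", [0.9, 0.1, 0.1])
cmd = decode_command(msg)
```

Keeping the payload to plain JSON means the same messages work over WebSockets for control and leave the heavy pixel stream to WebRTC.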
‘To make a mother hear her daughter's heartbeat for the first time.’ This was the (very uncommon) brief the Brazilian agency EscalaCity+ gave me.
For a Mother’s Day campaign, Motorola, together with the Brazilian retailer Colombo, planned a special ad. They had recorded the ultrasound exams of a deaf mother as a video file, and I created a system that translated the video's audio signal into signals driving small linear actuators inside a bracelet, custom-made to fit the mother’s wrist. Linear actuators vibrate far more precisely, across a wider range of amplitudes and frequencies, than a normal vibration motor. The final system I developed could translate different shapes of the sound wave into different vibration patterns, simulating a true heartbeat.
C++ | Arduino | Fusion360
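The core signal step — reducing the recorded heartbeat audio to actuator drive levels — can be sketched like this. This is a simplified illustration, not the original C++/Arduino code: the window size, RMS envelope, and 0–255 PWM-style scale are my assumptions.

```python
import numpy as np

def audio_to_actuator_levels(samples: np.ndarray, sample_rate: int,
                             window_ms: int = 20) -> np.ndarray:
    """Reduce an audio waveform to a sequence of vibration
    intensities (0-255) by taking the RMS amplitude of short windows."""
    window = max(1, int(sample_rate * window_ms / 1000))
    n = len(samples) // window
    trimmed = samples[: n * window].reshape(n, window)
    rms = np.sqrt((trimmed.astype(np.float64) ** 2).mean(axis=1))
    peak = rms.max() or 1.0  # avoid dividing by zero on silence
    return np.clip(rms / peak * 255, 0, 255).astype(np.uint8)

# A synthetic "lub-dub": short 60 Hz bursts in a second of near silence.
t = np.linspace(0, 1, 8000, endpoint=False)
beat = np.sin(2 * np.pi * 60 * t) * ((t % 0.8 < 0.05) | (abs(t - 0.25) < 0.025))
levels = audio_to_actuator_levels(beat, 8000)
```

Each level would then be streamed to the actuator driver at the window rate, so the bracelet's vibration follows the envelope of the heartbeat rather than a fixed buzz.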
Years ago, I watched the Abstract episode about Christoph Niemann, the artist behind several of The New Yorker's magazine covers. He explained how amazing it is that his work doesn't need to be translated into any language: everybody just gets what he means. That's how I feel about Instagram filters.
Companies and products increasingly try to send the same message across many different languages, and Instagram filters work as universal symbols.
Everybody knows and understands the meaning of a filter. So having a filter I created used by thousands of people around the world is significant to me.
I believe in the power of personal projects to show new possibilities of technologies and to express what really excites me at a specific moment in time.
Some side projects become prototypes used in real projects, for real clients; some are deeper research into a specific new technology; and others are just for fun.
In 2017, Mark Zuckerberg said that AR would replace our TVs. I agree with this statement so much that I started to wonder: 'If my TV will be replaced, what else will be converted to AR in the future? Maybe my light switch? My kitchen equipment?'
I had to answer these questions. So I took my HoloLens and created a framework that could connect to any object in my house and control it from virtual objects created in Unity.
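One way to picture that framework: each virtual object in Unity reports a state change, and a small bridge maps it to a command for the matching physical device on the network. An illustrative Python sketch — the device registry, ids, and command strings are my assumptions, not the actual HoloLens/Unity implementation:

```python
class HomeBridge:
    """Map virtual-object events to commands for physical devices.

    Each device registers a handler that turns a virtual state
    (e.g. a HoloLens switch being toggled) into a device command.
    """

    def __init__(self):
        self._handlers = {}
        self.log = []  # (device_id, command) history for debugging

    def register(self, device_id, handler):
        self._handlers[device_id] = handler

    def on_virtual_event(self, device_id, state):
        """Called when a virtual object changes state in Unity."""
        if device_id not in self._handlers:
            raise KeyError(f"unknown device: {device_id}")
        command = self._handlers[device_id](state)
        self.log.append((device_id, command))
        return command

bridge = HomeBridge()
# A hypothetical light switch: boolean state -> on/off command.
bridge.register("living_room_light", lambda on: "ON" if on else "OFF")
cmd = bridge.on_virtual_event("living_room_light", True)
```

The appeal of this shape is that adding a new household object is just one more `register` call; the virtual side never needs to know how each device is actually driven.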
In my first year at BUCK, I was playing with Spark AR every single day, always exploring new ways of creating experiences with it.
A topic that fascinates me is AR HMDs (head-mounted displays). How do you display content for users who are used to completely different experiences on smartphones, tablets, or desktops? Which techniques and inputs work in AR, and which don't? There are still so many questions to answer.
I tried to answer some of them with a prototype: an AR sandbox app for Microsoft HoloLens. A Lego-style app where you can connect blocks and control AR objects using physics, hand input, or game controllers. Technologies I used to make this: Unity, HoloLens, MRTK
When Apple launched ARKit 3, allowing AR apps to occlude people and generate people segmentation, I started prototyping several ideas for using this new technology in AR app experiences. Technologies I used to make this: Unity + Xcode
Spark's Patch Editor is a powerful tool for creating augmented reality filters. But if you have to create a complex scene that requires hundreds of nodes, the task starts to get tricky... Of course, there are ways to work with big projects, like Blocks, node groups, or copying and pasting several nodes, but reconnecting all those nodes can still take days or weeks.
I saw Google's experiment using TensorFlow and Unity and I loved it! I had to try my own version, but instead of TF and Unity, I used MediaPipe and Unreal. MediaPipe is a machine learning and computer vision framework from Google. The inspiration was this video: https://vimeo.com/363850538 Technologies I used to make this: Unreal Engine, MediaPipe, and Python
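The tracking side of a setup like this can be sketched as: MediaPipe detects hand landmarks per camera frame on the Python side, and each frame's landmarks are streamed to Unreal as a flat UDP packet. A minimal sketch — the packet layout, port, and the fake three-point "hand" standing in for a real detection are my assumptions:

```python
import json
import socket

def pack_landmarks(landmarks) -> bytes:
    """Flatten (x, y, z) landmark tuples into a JSON payload
    that a UDP listener inside Unreal can deserialize."""
    flat = [coord for point in landmarks for coord in point]
    return json.dumps({"landmarks": flat}).encode("utf-8")

def send_frame(sock, payload: bytes, host="127.0.0.1", port=9001):
    """Fire one UDP datagram per tracked frame. Lossy is fine here:
    a dropped frame just means the character skips one pose update."""
    sock.sendto(payload, (host, port))

# In the real loop, MediaPipe Hands would supply 21 landmarks per hand;
# here a fake three-point hand stands in for a detection result.
fake_hand = [(0.1, 0.2, 0.0), (0.5, 0.5, 0.1), (0.9, 0.8, 0.0)]
payload = pack_landmarks(fake_hand)
sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
send_frame(sock, payload)
sock.close()
```

UDP over a loopback or LAN link keeps per-frame latency low, which matters more here than guaranteed delivery.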
How will human expressions be translated in the metaverse? What technologies will enable users to express their emotions virtually? What is the most frictionless way to capture facial data?
These are some of the questions that made me explore new ways to capture and translate facial data to digital characters.
Technologies I used to make this: Unreal Engine, ARKit
Arboreal is a new exhibition at London's national Kids Museum, where children can interact with large-scale digital projections on the walls and floor: pose for selfies as fireflies swirl and settle onto their bodies, create a looping ambient soundtrack by triggering stalactites and fungi, stomp on giant buttons to engage with magical creatures from frogs to monsters, and propel creatures around a giant woodland game of pinball. They can attempt to sneak up and startle an array of creatures clinging to the forest flora, and discover the owner of the eerie eyeballs watching them from the bushes!