
Navigating the Labyrinth of CES 2024: A Personal Journey through Innovation, People, and Emerging Technologies


Introduction:

CES 2024 is now behind us. What are the salient memories from my personal experience? Walking. So much walking! And soreness in the back and legs from standing for hours and hours on end. It's really quite exhausting. I wore my Fitbit, a wearable health monitoring device, to measure how much I walked.

Here are my daily stats:

Tues: 11,993 steps, 5.73 miles

Wed: 27,538 steps, 13.16 miles

Thurs: 21,018 steps, 10.04 miles

Toward the end of each day, my lower back muscles were rioting against any further taxation, and the best I could do to subdue their protests was to periodically sit down and give them a short rest. This pain and exhaustion were the backdrop to my CES experience, and I'm sure a common one for attendees. It indirectly speaks to the sheer scale of CES.

This was a massive event:

The entire Las Vegas Convention Center (LVCC) is occupied by exhibitor booths. It's broken into three large wings: North, West, and Central. Each wing on its own is large enough to host its own convention, but to occupy all three? That's a huge convention. But wait, there's more! The Venetian Hotel and Casino has its own convention space, so there was yet another convention floor packed with booths. And just when you think you've seen it all, there's a basement level with still more booths, doubling the capacity! Realistically, I probably saw twenty percent of all the exhibits available. It's impossible to see them all in the limited time frame, especially if you stop and engage with the exhibitors.

There were an estimated 240,000 people attending the convention, coming from all over the world. When you pause and think about the scale of housing, feeding, and transporting an influx of that many people over a week, you realize what a logistical challenge it is. Imagine trying to eat lunch or dinner at the same time as 240,000 other people. With that kind of demand, imagine how the local prices of goods and services adjust to take advantage of it. In my moment of desperation for good coffee, I paid $5.25 for a 16 oz cup of brewed coffee at Starbucks! That's just a representative sample of how insanely marked up prices were on everything.

So many amazing people:

As I was looking at the people at CES, I had an interesting thought. If Thanos wore the infinity gauntlet and snapped his fingers to make everyone at CES disappear, what impact would that have on the world? Sure, you'd have the personal tragedies of lost loved ones for friends and family, but what about the impact of the loss on global entrepreneurship and the leadership at companies? Everyone at CES seemed to be a highly influential person and a VIP in their own right, as an engineer, inventor, business leader, visionary, journalist, investor, or entrepreneur. The loss of these people would be a terrible setback for the world, stunting technological and economic growth for years, perhaps. Mankind would collectively lose out on the innovations, contributions, and impact these people bring to the world. That put things into perspective a bit: every random stranger was a VIP in some interesting way, and when you stop and talk to anyone, you realize everyone has a fascinating background that brought them to CES. I had a random pizza lunch with a guy who ran a small branding company in Singapore, with 20+ employees, doing branding for small factories across Asia. I met an older guy who was large, strong, and had a handshake that could crush soda cans. He was a former special forces Marine with several tours in Afghanistan, so we bonded over war stories and our shared service. He was, oddly enough, running a cybersecurity company. When people think of CES, they think of tech and product reveals, but the real show is the people who are there.

Tech Products:

On the tech and product side, I had two competing interests. The first interest was to focus on what is relevant to our startup company. Are there any relevant products we need to be aware of? Potential competitors? Potential partnerships? If CES is a crystal ball looking into the future of the tech world, what does it tell us? Are we future-proof, or are we heading in the wrong direction?

My second interest was just satisfying personal interests, curiosities, and looking at random cool stuff and thinking about it critically. What was the product? What does it do? Is it of any interest to me? What is the opportunity here? What is unique about it? What's the wow factor? Who is making the product? What's the scale of their operation? Have they gone to market? Are they struggling or thriving? What are they doing right? What are they doing wrong? What are they looking to get out of CES? Do I honestly think the product will succeed in the market? Why or why not? Will it be a mass consumer product or a niche product? Will the product/company still exist in a year?

Let's get into the salient products, experiences, and my thoughts on them:

1. First off, the big global reveal at CES was the transparent wireless OLED TV by LG. The first time I saw it, it was just a grid panel of TV screens. There was a crowd of people around it taking photos and videos, but I didn't think there was anything remarkable about it, so I kept walking. I walked by the display several times, thinking it was unremarkable and surprised that anyone cared to take pictures of TV screens. It wasn't until I walked up from the rear of the display to see the 'behind the scenes' setup that I realized what the demo was actually showing. Behind the scenes, a screen was slowly dropping to reveal the TV display, which was entirely translucent. ...What? Wait a minute. WHAT?! Showing a display on a translucent surface and making it look good has always been the bugbear of AR, with generally bad solutions. Part of the problem is that rendering color on a translucent surface is near impossible; dark colors come out transparent rather than black, because black is the absence of emitted light. Has LG pulled off a miracle? Yes and no. They use a huge array of micro-OLEDs driven by a wireless signal. Somehow, without wires, each LED still gets powered and displays an RGB value. On the engineering side, it's a technical marvel. How did they do it?! On the consumer side, why would you ever want this? When I am watching Netflix, never have I asked myself, "Hmm, I wonder what the backsplash looks like right now?" It's a technological marvel, but the real-world use case strikes me as a solution looking for a problem. This was a lesson to me that even major brands can fail at the basics. Don't get so blinded by building technology that you lose sight of customer needs.

2. My second favorite demo was actually a boxing glove packed with sensors. I am a Marine veteran and do kickboxing three days a week, so any excuse to punch things is always welcome. I usually go all out at the gym, hitting with as much force as I can physically muster. The more intensity you train with, the stronger you get. If you look at me, I look like a skinny computer nerd who probably can't throw a punch. My arms don't appear to be big at all. But, as any fighter can tell you, that's a mistaken assessment. Power comes from your back, shoulders, and proper hip rotation -- not arm strength. I kept all that to myself as the exhibitor put the right boxing glove on me. Unfortunately, the left glove wasn't working anymore.

"Oh no, if I hit at full power, would I break their right glove too?" I thought to myself. "Well, it's not a good product test if you can't test it with full force!"

I was glad it was the right glove, because jabs are always weaker and my southpaw crosses are nothing like my right crosses. I let a few medium crosses rip. The bag wobbled a bit, so the exhibitor put his body against it to hold it steady. Alright, go time! I threw a handful of crosses with max power. The glove has pressure sensors that report how much force is behind each punch, and I was racking up over 200 lbs of force, well over my body weight. With each punch, the booth attendant was absorbing a lot of the force into his body, saying "Ooof, oww, ugh!" with each strike. A proper test isn't complete without hooks and uppercuts, so I threw a bunch of those in for good measure. He claimed that the glove had IMUs in it which could detect the orientation and velocity of your punch, to tell you what kind of punch you were throwing and whether you were throwing it correctly. I threw what I knew to be hooks and uppercuts, but the device identified them incorrectly. The representative said that the glove uses machine learning to calibrate itself to the punch motions of the user, which was why I was getting incorrect readings. But that raises the question: if machine learning calibrates itself to my punches, how would it ever know that my punches use bad form, once it has calibrated itself to that bad form? To the attendant's relief, I ended my punches.

"Wow, you know how to put your body weight into a punch!"

The product demo was limited in both capabilities and vision, but I was inspired. What they really need is a wristband packed with IMUs which you can wrap over your glove, along with an optical tracking system of some sort and a wireless transmitter. Pair that with a mixed reality headset such as the Meta Quest 3, integrate it with OpenXR as an input device, and then create a game in Unreal Engine which detects the locations of multiple boxing bags. Then you can go through a live sparring session with multiple virtual opponents. Not only do you work on your punches, but also your ducks, slips, and guard -- all without taking actual punches. That could become a real product you could sell to MMA gym franchises.
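The calibration question above can be made concrete with a toy sketch. This is entirely hypothetical code (not the vendor's actual system, and the feature numbers are made up): a nearest-centroid classifier "calibrated" only on the user's own punches can label punch types relative to that user's baseline, but it has no external reference for what correct form looks like.

```python
def centroid(samples):
    """Mean of a list of equal-length feature vectors."""
    n = len(samples)
    return [sum(v[i] for v in samples) / n for i in range(len(samples[0]))]

def calibrate(labeled_punches):
    """Build one centroid per punch type from THIS user's own examples."""
    return {name: centroid(vecs) for name, vecs in labeled_punches.items()}

def classify(model, punch):
    """Nearest-centroid label: relative to the user's baseline, not to ideal form."""
    def dist(a, b):
        return sum((x - y) ** 2 for x, y in zip(a, b))
    return min(model, key=lambda name: dist(model[name], punch))

# Toy features: [hip_rotation_deg, wrist_roll_deg, peak_accel_g] -- invented values.
user = {
    "cross":    [[40, 0, 9], [42, 2, 10]],
    "hook":     [[60, 80, 7], [58, 85, 8]],    # the user's hooks, good form or not
    "uppercut": [[30, -70, 6], [28, -75, 7]],
}
model = calibrate(user)
print(classify(model, [59, 82, 7]))  # -> hook: it matches the user's OWN hook,
                                     # even if that hook uses poor form
```

The point of the sketch: once the model's reference data is the user's own motion, "correct" collapses into "consistent with yourself" -- judging form would require a second, independent reference set of expert punches.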

3. The third most salient thing I saw wasn't actually at CES, but exclusively in Las Vegas: driverless taxis. They look like the Google Street View camera cars, but they drive around and pick people up and drop them off -- all without an actual driver. Las Vegas is the product test environment for these vehicles, which are powered by computer vision and AI. I think seeing them out in the wild is the inflection point that definitively says: "We are now living in the future. This is the moment it happened. The world as you used to remember it will look increasingly unfamiliar from here on out."

Would I get into one of these cars? Would I trust my life to CV and AI? As a passenger, at least, you have the choice to opt in. But as a pedestrian? You can't opt out of being near one. All it takes is walking across a crosswalk at night, wearing dark clothing, and you risk getting run over by an AI driver. Yikes. Who do you hold responsible? If the AI driver didn't see you and ran you over, would it know? Would it be able to step out of the car and give you first aid? Or call a paramedic? Or would it just keep on driving because it didn't know it failed?

Airline pilots will often just use autopilot to fly the plane to its destination. It's good enough that the autopilot can even land the plane. But every plane, even on autopilot, still has a flight crew. The pilots are there to override and take over from the autopilot when non-standard conditions arise. In software engineering, you can develop a solution that handles 99% of the common use cases, but the 1% it can't handle are your "edge cases". With a sufficiently complex problem, you usually don't know if or where your edge cases are, because finding them takes a large set of input variables landing on a particular combination of values. The driverless cars being tested in Las Vegas will handle the 99% of common use cases, but the precise concoctions of input variables that trigger the edge cases -- those are being discovered live! I don't want to be part of a driverless car scenario when an edge case is discovered. Even human drivers have accidents, and humans may even be statistically worse drivers, but at least you know that human drivers, like human pilots, are capable of instantly assessing a non-standard situation and reacting safely.
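To illustrate the 99%/1% point with a deliberately toy example (hypothetical code, nothing to do with any real vehicle stack): a function can behave perfectly for every input its author thought to test, while a rare input silently falls outside its assumptions.

```python
def stopping_distance_m(speed_kmh, road):
    """Stopping distance from a small calibration table -- the 'common cases'.

    The table covers every road type the author tested. Anything else is an
    unhandled edge case that only shows up when that input actually occurs.
    """
    table = {"dry": 0.006, "wet": 0.010, "ice": 0.040}  # made-up coefficients
    return table[road] * speed_kmh ** 2

print(stopping_distance_m(50, "wet"))   # 25.0 -- the common case works fine
# stopping_distance_m(50, "gravel")     # KeyError -- the edge case nobody tested
```

The failure here is at least loud. The scarier version is the edge case that returns a plausible-looking but wrong answer, which is exactly the kind of defect that only gets discovered live.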

4. The fourth thing that stood out for me at CES was actually the growth and advancements in 3D printing and CNC milling. I have a 3D printer myself and have designed and printed several objects. It's a hobby of mine. It's incredibly empowering to be able to imagine something, design it, print it, and then hold it in your hands. It's like a manifestation of your imagination. If you look carefully at the marketplace, you can even see 3D printed items commonly being sold to consumers. My girlfriend purchased an ammo loader for her pistol for $15, which turned out to be a 3D printed plastic part. In terms of material, it probably cost $0.25 in plastic and 20 minutes to set up and print. You can effectively have on-demand production for low-volume sales, which means you don't necessarily need a warehouse to keep an inventory of merchandise on hand. PLA is a relatively hard plastic, and at a 0.2 mm layer height you can produce high-resolution models with good tolerances, though you can still get slight banding artifacts. Resin printing isn't as common, but it cures a resin bath with light to produce models of such high resolution that you can't see any banding artifacts at all. Hard plastics aren't your only material anymore: some 3D printers can print metal by sintering it from powdered form. 3D printing is an additive process done in layers; the flip side is CNC milling, a subtractive process that uses cutting bits to remove material. Traditional printers and mills have been constrained to three linear axes, but some of the latest machines add rotational axes, giving the tool head up to six degrees of freedom. Current conventional 3D printers are limited in the size of their print area, but I think we are at the early stages of what 3D printers can do.

A trend I saw at CES is support for multi-material print heads, as well as hobbyist CNC mills capable of sub-millimeter accuracy on titanium! It may be possible that sometime in the distant future, 3D printers will be capable of printing out circuitry and various electronic components. In theory, you could print your own electronics. If that becomes possible, it's going to revolutionize the commoditization of goods, and we'll see a significant disruption in the economics of consumer goods. 3D printing may be mainly a hobbyist enterprise with limited industrial use today, but I think it's worth paying very close attention to. It's like the early days of the PC, and it will blow up in the next decade or two. The majority of hype these days is focused on AI and VR/AR, but mark my words, 3D fabrication will have its day soon.
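The "$0.25 in plastic" figure above is easy to sanity-check with back-of-the-envelope math. The spool price and part weight here are my own assumed numbers, not measurements of that actual part:

```python
# Assumed (hypothetical) inputs: a 1 kg PLA spool for ~$20, a small part using ~12 g.
spool_price_usd = 20.00
spool_weight_g = 1000
part_weight_g = 12

# Material cost scales linearly with the filament mass consumed.
cost = spool_price_usd * part_weight_g / spool_weight_g
print(f"${cost:.2f}")  # -> $0.24, in line with the ~$0.25 estimate
```

Against a $15 retail price, that margin is what makes on-demand, no-inventory production viable for low-volume sellers.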

5. Lastly, we should cover AR and VR, which certainly had a wide presence at CES that was hard to miss. I've been in VR since the day Facebook bought Oculus in 2014, so I've watched the industry grow and mature over time -- in large part thanks to the investment and commitment of Facebook (which rebranded to Meta to signify that commitment). There have been lots of VR hardware devices, each with different innovations and solutions to technical problems. First we had the seated VR experience; then Valve went for a room-scale experience and introduced handheld motion controllers, which Oculus needed to support for feature parity. Then Valve and HTC went their separate ways to make high-end PCVR hardware, of which the Valve Index is probably still best in class. There have been a lot of peripheral VR devices to supplement the base offerings. I worked with the Leap Motion (since rebranded to Ultraleap after a merger), but the problem with all VR peripherals is that by supporting them, you fragment your market, and you rarely see an ROI on the engineering resources spent. You're effectively taking a niche market and carving a niche out of it, creating a niche within a niche -- which is generally bad business sense. Within a niche market, you always want to serve the lowest common denominator to maximize your market reach. I saw a bunch of product demos for VR accessories which I already know are going to fail to launch and reach market adoption -- that's not me being a pessimist, that's just being realistic about how the market works.

There were a significant number of VR headset reveals at CES, which I would count as "Me too!" products. The Meta Quest product line dominates the North American (NA) VR market. Other VR headsets may have bigger presences in Asian markets, but it will be exceedingly difficult for them to break into the NA and European markets with notable market share. For the most part, the hardware devices can be ignored. Either they support the OpenXR API standard, and your content automatically works on their devices, or they don't support OpenXR and will never be compatible with the majority of VR content -- in which case the devices become dust collectors and their market share withers away. It's rare to see true innovation in "Me too!" hardware struggling to maintain parity with existing market leaders, so I didn't see anything remarkable to write home about.

In Conclusion:

If I put on my robe and wizard hat to predict the future of AR/VR, I think the next innovation to look for is a convergence of AR into VR, where VR supports AR through passthrough using high-resolution RGB camera sensors. Existing AR implementations based on translucent displays and optics are all going to fail, precisely because of the technical challenges I mentioned earlier with the LG TV. Mobile VR is the future, and the main technical challenges and limitations are going to be battery life and compute power on a mobile processor. I think people also underestimate the future importance of 5G connectivity and high-precision SLAM paired with GPS. AR/VR headsets will be worn by people walking down the street, or by groups playing together in a large park in a shared virtual environment. This is the future, and it belongs to those who can envision it.

Sources:

Image source: Consumer Technology Association
