Our team was contracted by a major telecom and sports broadcaster who wanted our help identifying opportunities emerging at the intersection of 5G connectivity and immersive video technologies. The idea was to explore these opportunities with in-market products, sandboxed from the core app and brand, built by a newly formed product team working on a new mobile application. The deadline was a very short 4 weeks, and our team was just myself and another consultant. Our objective was to identify a strategy for where they should start their experimentation.
We made an aggressive 4-week plan to take us through divergent and convergent sprints, with activities spanning user research, opportunity framing, ideation, evaluation and concept development.
In our first week, my partner and I conducted extensive stakeholder interviews and reviewed the existing user research our client provided. We did this to better understand their key objectives, and to learn whether there were any existing efforts or established partnerships we should know about.
To supplement our review of existing user research we also conducted interviews with colleagues who were sports fans and fit our audience profiles.
- 12 stakeholder interviews
- literature review
- customer profile development
- technical audit
- competitive analysis
In this first week we conducted 8 interviews with sports fans and synthesized our user needs insights into two customer profiles, giving us a snapshot of our audience's core jobs, pains and gains. We created two profiles because our client had identified two different customer segments they’d like to design for, and the profiles helped us understand the similarities and differences between their needs.
To familiarize myself with our client's existing sports entertainment experiences, I benchmarked their current mobile application, streaming products, and sports programming. ‘Benchmarking sports programming’ means that I watched their broadcast of a hockey game, which was a new experience for me because I’m not really that interested in the sport. By immersing myself in hockey culture and spectatorship during our first week, I was able to better empathize with hockey fans and gain a deeper understanding of their sports entertainment needs.
Major shifts in technology like the move to 5G can sometimes make it harder to spot great opportunities through the noise and uncertainty. I’ve always found, though, that by gaining a deeper understanding of the technologies at play, I’m better able to spot opportunities and identify patterns within these emerging and intersecting technology spaces.
During our stakeholder interviews, I also noticed a wide range of knowledge levels regarding the more nuanced differences between various immersive video technologies, which seemed to be creating some of the misalignment and a sense of uncertainty. Beyond deepening my own understanding, I wanted my technology exploration to contribute towards a more established baseline for our two planned co-creation workshops. My plan was to establish a quick baseline that would ground our ideation in the possibilities and constraints, and help us think on both near and farther horizons.
Technology audit & competitive analysis
I started by doing a scan of the technology landscape forming around known applications of 5G connectivity and documented which ones would likely be relevant in the sports entertainment industry.
Established use cases for 5G are typically bandwidth-intensive and/or require very low latency connections, and many of the new technologies in the immersive video and entertainment domains fit that profile.
One of the business objectives of this project was to find use cases for 5G technology that would drive awareness and adoption. Immersive video technologies can be highly marketable when they offer an element of spectacle or novelty; the marketing team we interviewed described this spectacle as the ‘wow factor’.
I’ve worked with some of these immersive video formats in the past, so I was already familiar with the current generation of technology. What was new to me was the next-generation technology known as volumetric video. These are video formats that can be viewed with six degrees of freedom of movement: a video you can walk through. This allows the viewer to see from any perspective, rather than from the single fixed point that 180° and 360° videos are limited to.
To record volumetric video, large arrays of dozens of cameras are typically used, and the upfront investment can be upwards of 10 million dollars. On top of that, heavy processing and data infrastructure investments are needed to actually combine, process and stream the video. We’re in a transitionary period where immersive video technology is rapidly developing while, at the same time, the 5G networks required to carry it are being built. We’re moving from single cameras to integrated systems that chain dozens of cameras together. The investment costs are really significant, and this difference helped me realize the importance of thinking about this project on a continuum, on both near- and farther-term horizons.
After I experienced this next generation of immersive video for myself, the difference was immediately evident. I tried a demo published by the research team at Google where I was able to watch some light field video recordings. Light fields are a type of volumetric video.
In my home office I have a VR setup, and in the screen capture on the right you can see what I experienced. It felt almost as though I was standing in a room right in front of these two people; I could move left to right and shift my perspective. It was much more immersive and natural to view compared to the current-generation stereoscopic video formats I’ve used. Here is a link to the SIGGRAPH paper on the research if you’d like to read more.
Horizon 1 & Horizon 3
During this first week of stakeholder meetings, we learned that they already had a technology partner, and so we set up interviews with that partner and learned about the near-term capabilities of their hardware and software.
They were using single-point, super-high-resolution cameras (50+ megapixels) with 180° field of view lenses, and had the software capability to use AI to automate cropping of that video in real time. This allowed them to create a multitude of virtual cameras that could be streamed to a cloud infrastructure that was being built. The video could also be projected onto a hemisphere and viewed in VR, and cameras could be paired to create a 360° video as well. I would generally consider this technology part of the ‘current generation’ of immersive video: the video is not stereo and has no depth data, but it’s still very capable and has some interesting software capabilities.
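At its core, this kind of AI auto-cropping can be pictured as a windowing operation over a very high-resolution frame. Below is a minimal sketch of the idea; the `virtual_camera` helper, frame sizes, and tracker-supplied point of interest are all illustrative assumptions, not the partner's actual software.

```python
import numpy as np

def virtual_camera(frame: np.ndarray, center_xy: tuple,
                   out_w: int = 1920, out_h: int = 1080) -> np.ndarray:
    """Crop a fixed-size 'virtual camera' window out of a high-resolution
    frame, centered on a point of interest (e.g. from an AI puck/player
    tracker), clamped so the window never leaves the source frame."""
    h, w = frame.shape[:2]
    cx, cy = center_xy
    x0 = int(min(max(cx - out_w // 2, 0), w - out_w))
    y0 = int(min(max(cy - out_h // 2, 0), h - out_h))
    return frame[y0:y0 + out_h, x0:x0 + out_w]

# Simulate one wide frame (much smaller than 50 MP, for the example)
frame = np.zeros((2160, 3840, 3), dtype=np.uint8)
cam = virtual_camera(frame, center_xy=(3800, 100))  # near the top-right corner
print(cam.shape)  # (1080, 1920, 3)
```

Running the same function per tracked subject yields multiple independent 'virtual cameras' from a single physical camera feed.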
While not quite as exciting as next-generation volumetric video, it’s helpful to be able to start experimenting right away with a known quantity, and it’s an important constraint for us to understand regardless. Our stakeholders also expressed interest in farther-horizon concepts and are open to additional technology partners. This led me to focus on identifying the near-term technology building blocks and ensuring we were aligned there, and then, in a broader scan, identifying which technologies are farther-horizon but important to plan for because of the development time and investments required.
During our second week, we had planned two ideation workshops. The first would be with our stakeholders and the second would be with our hockey-fan research participants. For the workshops, I wanted to continue using Miro which we’d been using throughout our project planning and immersion research. It’s a highly collaborative online whiteboarding tool and great for remote teams.
- workshop design
- 2 ideation workshops
- affinity mapping & ideation synthesis
For our first workshop, I designed a creative matrix exercise. It works well for remote teams and for the number of participants we were expecting.
For a creative matrix, you need two variables that run along the x and y axes. I chose how-might-we prompts and context (or ‘placeona’). I also ran the matrix across two timeline horizons, with a ‘time machine’ at the end: for the last 10 minutes, we focused on concepts for the investment- and development-heavy ‘Horizon 3’ technologies.
To structure our ideation, I crafted prompts that would help our team ground our ideas in our users' highest-priority needs. I used our two customer profiles to surface our users' highest-priority jobs, pains and gains, and articulated these needs as opportunities using ‘How Might We’ questions.
How might we connect next gen fans and provide more social viewing experiences?
How might we connect next gen fans to the players and personalities they follow?
How might we help sports buffs escape to a distraction-free sports environment?
How might we provide sports buffs with a sense of presence and new perspectives on the game?
I needed another parameter for our creative matrix exercise. The variables for this activity might differ depending on the project, and defining which inputs are best for yours is important.
In our stakeholder interviews, we learned they were interested in both the at-home and in-venue experiences, and after thinking about these contexts, I introduced the idea of an on-the-go experience, which might be during someone's commute or while visiting a landmark within a city. This was actually an important context for us to consider because it was the only one of the three certain to require a 5G connection, whereas at home or at a venue a wifi connection is generally available.
But beyond just the location, I wanted to think more deeply about context and define placeonas, a term coined by Bill Buxton to describe a person in a place (place + persona). The placeona helps us ground the user experience: it can determine whether the user has access to a second screen, whether they are seated, how much attention they have available, and so on. While introducing our ideation session, we spent some time with our group talking about the needs of users in these different contexts.
Ideation workshop 1 - Creative Matrix
These variables are the inputs into the creative matrix activity that I designed and can then be combined freely by the participants during the workshop. On the y-axis, I have the three placeonas, and the x-axis has our ‘how might we’ ideation prompts.
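Structurally, a creative matrix is just the cartesian product of its two axes: every cell pairs one context with one opportunity for participants to fill with ideas. A small sketch, using shorthand stand-ins for the actual workshop labels:

```python
from itertools import product

# Shorthand stand-ins for the actual workshop inputs.
placeonas = ["at home", "in venue", "on the go"]          # y-axis
hmw_prompts = [                                            # x-axis
    "connect next gen fans for social viewing",
    "connect next gen fans to players and personalities",
    "give sports buffs a distraction-free environment",
    "give sports buffs presence and new perspectives",
]

# Each cell is one (context, opportunity) pairing to ideate within.
cells = list(product(placeonas, hmw_prompts))
print(len(cells))  # 3 placeonas x 4 prompts = 12 cells
```

Running the same matrix twice, once per horizon, doubles the idea space without adding facilitation complexity.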
I split our ideation into two steps, both using the same creative matrix: one with near-term technologies in mind, and a second where new technologies were introduced. This kind of ‘time machine’ allowed us to spend time thinking about the same opportunities along a timeline of both a near-term ‘Horizon 1’ and a farther-term ‘Horizon 3’. The board was reset between the two steps so we could keep our output organized.
- 180° & 360° immersive video
- Mobile augmented reality
- AI auto-cropping
- Volumetric video
- Headset based AR & VR
- Web3 & the metaverse
This ended up being a very helpful way for us as a group to think in both near and farther horizons. In our first week of interviews, we had seen a bit of misalignment regarding various existing ideas and efforts, which seemed to be due to differences in goals, timelines and expectations around the technology. By grounding our concept generation in the right set of user-centric variables, we were able to fully embrace the group's enthusiastic divergent thinking while remaining relevant to our users' needs and the available technologies.
Affinity mapping & synthesis
There was a lot of great output from our first workshop and I worked with my partner to do some clustering and affinity mapping to see common ideas and themes. After this, I worked to further synthesize the output in preparation for a second workshop. At this point, the ideas were a bit shallow, more like seeds, and we needed to sprout them into more complex seedlings. I spent some time taking our affinity map and synthesizing it into a series of concept cards that we could use in our next ideation workshop, where we would mature them further.
Ideation workshop 2 - Yes, and... Combinations
To build off of and develop our ideas further, I designed a new activity that I hadn’t used before. I called it a “Yes, and” brainstorm, though it could also have been called a ‘remix’ workshop. It was an hour-long session with our initial participants from our user research: six of the hockey fans who fit our ‘sports buff’ and ‘next gen fan’ segments.
I first ran our participants through all the existing ideas, which I split into two categories: ‘5G-enabled experiences’ and ‘interactions’. I did this because I had noticed in our synthesis that some ideas were interesting, and potentially desirable, but did not require 5G connectivity, an important evaluation metric for us. They were still valid though, and it was apparent that they could be combined with a 5G-enabled experience card to create a more desirable user experience.
I spent the first 30 minutes with our participants going through the 23 concept cards one by one, taking time in between to discuss each of them together and answer questions. We then broke into two groups of 4 and spent 30 minutes ideating collaboratively in our breakout groups. I let them know that they could mix and match any of the cards but should aim to have at least one 5G-enabled experience card in their combo, bringing the cards together and then building on that combination with their own notes and ideas.
In our third week, we needed to mature the output from our ideation workshops and showcase it as developed ideas for further evaluation as we converged on our best concepts and recommendations. In parallel with this concept maturation, we worked with our stakeholders to establish our evaluation criteria, and then designed an evaluation rubric to help us isolate the best concepts to move forward with.
- 10 matured concepts
- concept museum design
- designing bullseye workshop
- 6 evaluation criteria interviews
- design of evaluation rubric
- concept showcase presentation, workshop and group voting
Evaluation & prioritization
My partner and I worked together to create a small activity for a series of short 1:1 workshops with 6 of our stakeholders: a basic bullseye diagram that we used with them to better understand the most important criteria to consider in our evaluation.
We had them focus on criteria covering the risks of desirability, viability, feasibility, and usability, and then asked them to rank these as primary, secondary or tertiary. My partner used a summation of these rankings to create our evaluation rubric.
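A rubric built this way boils down to a weighted sum: each criterion's rank tier sets its weight, and each concept gets a weighted total. The criteria names, weights, and ratings below are illustrative assumptions, not the actual rubric values:

```python
# Hypothetical rank weights; the real rubric summed rankings
# gathered from six stakeholder 1:1 workshops.
RANK_WEIGHTS = {"primary": 3, "secondary": 2, "tertiary": 1}

# Hypothetical criteria and their rank tiers.
criteria_ranks = {
    "requires 5G": "primary",
    "wow factor": "primary",
    "technical feasibility": "secondary",
    "business viability": "secondary",
    "usability": "tertiary",
}

def score_concept(ratings: dict) -> int:
    """Weighted sum of 1-5 ratings, weighted by each criterion's rank tier."""
    return sum(RANK_WEIGHTS[criteria_ranks[c]] * r for c, r in ratings.items())

ar_hunt = {"requires 5G": 5, "wow factor": 4, "technical feasibility": 3,
           "business viability": 4, "usability": 3}
print(score_concept(ar_hunt))  # 44
```

Scoring every concept against the same rubric makes the convergence step comparable and defensible rather than purely intuitive.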
While my partner worked on the evaluation rubric, I took the output from our ideation week and matured it into more resolved concepts. This included developing the value proposition, a product description, the relevant placeona, and illustrations that helped bring each concept to life. I did this for 14 concepts, and we ended up showcasing 10 of them.
Concept showcase & feedback session
At the end of the week, we presented all our concepts to our stakeholders, shared as a ‘concept museum’ in Miro that they had been able to preview the day before.
I took everyone through each of the 10 concepts, pausing after each one to discuss the ideas and answer any questions. This was followed by a dot-voting session where everyone could anonymously vote for their favourite ideas, and we finished with some time for final thoughts.
After our weeks of research, ideation and evaluation, we were able to converge on our final concepts. In this final week, our focus was on refining our concepts and bringing them to life with branding, mockups, storyboards, and a product vision video.
- Concept development
- UX/UI design
- VR prototyping
- Concept video editing
After our final shareout and feedback session, we aligned with our product owner on the final concepts and our recommendations. Our technical research and stakeholder interviews had taught us the importance of scoping product experiments for the near term while also identifying the highly valuable opportunities to invest in on farther horizons. Their initial interest was in many separate projects, but we began to see a strategy in thinking about them as a sequence. We outlined a product strategy that takes into account the immediate plays RSM can make today, as well as more long-term plays. We presented 3 concepts that can all live on the platform, each exemplified with journey storyboards, technology examples, desired outcome metrics, and key hypotheses for further testing.
Location-based, 5G-enabled AR scavenger hunts
- Create awareness and recruit users
- Showcase 5G capabilities
- Generate excitement
Modern mobile interactions applied to sports
- Create lasting engagement
- Showcase 5G capabilities
- Increase retention via great UX
A personalized multi-screen sports entertainment suite
- Bring SportsnetX to AR/VR in the home
- Deliver a ‘behind the glass’ experience
- Hedge against cord-cutting and streaming
The first public introduction to SportsnetX and an interactive way to engage and recruit users via 5G-enabled hockey scavenger hunts.
A location-based scavenger hunt
An experience that promotes discovery, engagement, and sharing, and that spans the city or province. AR hunters are tasked with locating experiences around their region and are given the opportunity to win tickets to the big game.
- The 180° hemispherical video from c360 is well suited for a portal AR experience
- 360° spherical video can be leveraged to create portals that users can walk through, and look around.
- Volumetric videos of players or onscreen personalities can give users a unique photo op to share on social media.
Brian is walking home from work one day when he notices a new ad from Sportsnet
He scans the QR code using his iPhone to unlock the experience; a web-based AR experience and ‘App Clip’ allow him to jump right in, without needing to install anything.
After scanning, a portal is opened to exclusive hockey content via an AR-powered window
To continue the hunt, and in pursuit of winning game 5 tickets, he taps the CTA to download the SNX application.
After playing the AR Hunt in a few locations, Brian wins tickets to the next Canucks game!
Following modern mobile interaction patterns, SNX Reels is a natural way for many Next Gen fans to experience sports in a new, engaging manner.
Vertical video on mobile has gained widespread adoption, along with the swipe-gesture interaction patterns associated with applications like TikTok, Snapchat, YouTube and Facebook Reels. Users can flip through different player cams by swiping up and down on their devices. Left and right could navigate between c360 cams, teams, fan stands, or perhaps something else?
Tilt & look around
Both hemispherical 180° and spherical 360° video are particularly well suited for immersive video playback, using the gyroscope and hand motion to inform a 9:16 crop. This also allows us to leverage the aforementioned gesture input without needing to tap and drag to pan the video.
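As a rough sketch of how gyroscope input could drive such a crop, the snippet below maps yaw and pitch to a 9:16 pixel window in an equirectangular 360° frame. It is a simplified illustration (a flat pixel window rather than a proper reprojection), and the frame dimensions and default field of view are assumptions:

```python
def crop_window(yaw_deg: float, pitch_deg: float,
                frame_w: int = 7680, frame_h: int = 3840,
                fov_h_deg: float = 45.0):
    """Map gyroscope yaw/pitch to a 9:16 portrait crop of an
    equirectangular 360 frame. Simplified: treats the crop as a flat
    pixel window rather than reprojecting the sphere."""
    crop_w = int(frame_w * fov_h_deg / 360.0)   # horizontal FOV -> pixels
    crop_h = int(crop_w * 16 / 9)               # 9:16 portrait aspect
    cx = (yaw_deg % 360.0) / 360.0 * frame_w    # yaw wraps around the sphere
    cy = (0.5 - pitch_deg / 180.0) * frame_h    # pitch: +90 top, -90 bottom
    x0 = int(cx - crop_w / 2) % frame_w         # horizontal wrap-around
    y0 = int(min(max(cy - crop_h / 2, 0), frame_h - crop_h))  # vertical clamp
    return x0, y0, crop_w, crop_h

# Device turned 90° to the right, held level:
print(crop_window(yaw_deg=90, pitch_deg=0))  # → (1440, 1067, 960, 1706)
```

In a real player, the swipe gestures described above would switch which feed is shown, while this orientation-to-window mapping pans within the current feed.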
Live c360 footage
Using one-handed up and down swipe gestures, users can flip between multiple games and camera perspectives. Camera cropping control can be toggled between gyroscopic input and AI puck and player tracking.
Curated clips of vertical video in a chronological stories format, personalized to users by aggregating content from the teams and players that they follow.
Near-instant replays using c360 footage of major events for the teams you follow. A benefit to users at home, but also on-the-go or in-venue.
User Journey - In venue
Tony leaves his seat during a Canucks game to pick up some snacks
While waiting in line he’s able to keep an eye on the game with the SNX Reels application.
On his way back to find his seat something happens that he misses. The crowd goes wild.
User Journey - At home
Marie is watching the Leafs game at home and uses the SNX Reels application as a second screen for live cameras, highlights, and instant replays.
She flips between multiple angles provided by six c360 cameras, giving her front row access to the game. She’s also able to keep an eye on other games for the teams that she follows.
After a contested goal she receives an instant replay and is able to quickly add her own commentary, and share it as a link with a friend.
Marie shares the instant replay with her friend Marcus.
A personalized multi-screen sports immersion suite where you can connect with friends and get a front-row, behind the glass seat.
Moving beyond the television
In this concept, we’re considering a fully immersive sports viewing experience, and I thought it would be prudent to talk a bit about where adoption is today and how it's progressing. We considered mobile AR-based experiments that could be run in the living room today, but with the rapid growth of VR headsets and the emergent market for lightweight passthrough AR headsets, it’s also clear there is a pathway to new living room experiences that go beyond the television screen and into the metaverse.
In my explorations of this concept, I decided to do a bit of prototyping and used a tool called Spoke to create a small demo of what the SNX Suite might look like in the metaverse. I was able to invite my colleagues into this space, and I created a short trailer for the experience to share with our stakeholders. I used actual c360 footage of a Calgary Flames game, shared on YouTube by our client, to create the immersive balcony scene.
Passing the downtime
In our early immersion research, we learned about how games are broadcast and about the cadence of live hockey games. There is a lot of downtime where the action is paused, and this may be for a review of a play, or while the ice is being resurfaced.
In a broadcast game, the onscreen personalities often fill this time with commentary and highlights. This creates a challenge for streaming immersive video, one that I experienced while benchmarking Venues, a VR spectatorship app from Meta: the downtime needs to be filled somehow for spectators to want to stay in the experience. In SNX Suite, we can explore different aspects of ‘play’ and personalization with ‘widgets’ that can be toggled on and off and tailored to the user, giving the user a self-directed form of entertainment whenever they hit downtime during a live event.
Our stakeholders were very pleased with the outcomes of this short 4-week project, and moving into 2022 they will be working with Connected on these first experiments for their sandbox platform. The project also built trust with Connected and opened additional opportunities to partner together.