D'Arcy Norman, PhD

Ideas

Must-play examples of great video game design?

The plan for my PhD is taking a bit of a different tack, to take advantage of an incredible opportunity that will remain cryptically-alluded-to for now. I need to go deep on video game design, and I’ll be approaching things from a teachy-learny perspective, so ideally I need to spend some quality time with key video games that are exemplars of experiential learning. I’m thinking it doesn’t need to be full-on Oregon Trail you-have-died-of-dysentery, but it should include games that pioneered approaches to teaching in some way. Things like the deceleration curve path in Forza Motorsport 5 et al. that guides you through difficult turns on a track, or the time-rewind-retry thing in Braid that lets you iterate on a plan until you solve it, or the try-stuff-until-you-figure-it-out exploration of Portal.

Read More

Remixed Reality: Manipulating Space and Time in Augmented Reality

Thanks to Ehud for the link to this video and paper from CHI 2018. This is almost exactly the tech I built in my head to use for my PhD research. Last year, this was completely impossible. Then, kind of possible but kludgey. Now, it’s basically there.

With this, you could volumetrically record a room, and play it back from various spatial and temporal vantage points. Check out the 2:40 point where the viewer and recorded-person separate, and then the recording is played back from a different time…

Read More

volumetric video of a (jazz) performance

For my PhD research, I've been bouncing ideas around for how to volumetrically capture a performance or classroom session in 3D, and then layer on additional contextual data (interactions between participants, connections, info from dramaturgy, info from SoTL, etc.).

This NEBULA experimental jazz video by Marcin Nowrotek kind of gets at some of what's in my head. Imagine this, showing a group of students collaborating in an active learning session, and instead of notes/percussion visualizations, some kind of representation of how they are interacting etc… Also, since it's all in 3D, imagine being able to interact with the recording in 3D using fancy goggles.

Read More

Thoughts on immersive capture

I’m still not sure how to fully describe what I’m trying to do. At the most basic level, I want to find ways to apply technologies and practices to support and enhance reflection by people as they learn the craft of teaching. That’s what prompted the Nao robot study, and the various types of media (text, video, cartoon video, audio, synthetic audio…).

In a perfect world, what would this look like? I imagine capturing a teaching and learning session (a classroom session, a field trip, a laboratory activity…) volumetrically. The shape of the spaces. The shapes of the participants. The flow of participants throughout the session. The content on various displays and devices as used during the session. The video/texturemap and audio of the session. To capture everything. Multimodal, multisensory, volumetric capture of an event.
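A rough sketch of what one captured session might look like as a data bundle – purely hypothetical names and fields, just to make the multimodal/volumetric idea concrete:

```python
# Purely hypothetical sketch of what one captured session "bundle" might hold.
# Field names are placeholders for the kinds of streams described above.
from dataclasses import dataclass, field
from typing import List

@dataclass
class TimedStream:
    name: str                  # e.g. "room mesh", "participant positions", "display capture"
    modality: str              # "volumetric", "video", "audio", "sensor", ...
    timestamps: List[float] = field(default_factory=list)   # seconds from session start
    samples: List[str] = field(default_factory=list)        # file paths or payloads, one per timestamp

@dataclass
class CapturedSession:
    title: str                 # e.g. "week 3 active learning session"
    space_mesh: str            # scanned shape of the room
    participants: List[str]    # who was in the session
    streams: List[TimedStream] = field(default_factory=list)
```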

Read More

Initial “mini-lesson” media

I recorded 3 mini-lessons as part of the HRI course study – myself, Ahmed, and Sowmya. Here are some of the sample media from Ahmed’s session.

[Image: Three versions of media to review by the instructor: high definition video (left), edge-detection “synthetic” video (centre), and Nao robot performance (right)]

[Image: Choregraphe interface with sequenced poses, gestures and robot simulator]

Videos: (password: cmd)
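For a sense of how the edge-detection “synthetic” video could be generated from the high-definition recording, here’s a minimal sketch using OpenCV’s Canny edge detector – the file names and threshold values are placeholders, not the actual processing used for the study media:

```python
# Minimal sketch: converting an HD recording into an edge-detection "synthetic"
# video, roughly the kind of abstracted view described above.
# Assumes OpenCV (cv2); "mini-lesson.mp4" and the Canny thresholds are placeholders.
import cv2

cap = cv2.VideoCapture("mini-lesson.mp4")
fps = cap.get(cv2.CAP_PROP_FPS)
width = int(cap.get(cv2.CAP_PROP_FRAME_WIDTH))
height = int(cap.get(cv2.CAP_PROP_FRAME_HEIGHT))

out = cv2.VideoWriter("mini-lesson-edges.mp4",
                      cv2.VideoWriter_fourcc(*"mp4v"),
                      fps, (width, height), isColor=False)

while True:
    ok, frame = cap.read()
    if not ok:
        break
    gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
    blurred = cv2.GaussianBlur(gray, (5, 5), 0)   # smooth to reduce noisy edges
    edges = cv2.Canny(blurred, 50, 150)           # thresholds are a guess
    out.write(edges)

cap.release()
out.release()
```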

Read More

Volumetric Capture

This process is interesting, but WAAAAAY too intrusive to capture a session or performance without drastically altering it in the process.

A startup imaging company called 8i has been publicly crowing about its volumetric capture system for over a year, and it has used festivals like Sundance, film properties like Mad Max, and stars like Jon Hamm to excite VR newbies. Put this headset on and look-ee-here, you’re really next to other people! Neat.

Read More

Robocomic – HRI exploration of performance by a robot

OK – this may be a bit out there, but I’ve got an idea for the HRI project, as a way to explore emotional context and connection with an audience.

J. Alex Boyd wrote a piece for McSweeney’s back in 2006, “Jokes made by robots, for robots.” Some profoundly unfunny jokes, unless they’re performed by a robot.

I’m thinking – I can have a human performer read the jokes as a stand-up comedy act, recording the performance on video. I can then use the video as source material (for movements, positions, gestures, timing), and reproduce the performance with a Nao robot (using its text-to-speech for the robot’s voice). I’m picturing the performance as somewhat deadpan – in the genre of Steven Wright’s act.
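For a sense of what the robot side could look like, here’s a minimal sketch using the NAOqi Python SDK’s ALAnimatedSpeech, which lets gesture animations and pauses be embedded in the spoken text – the robot address, animation names, and joke are placeholders, not the actual act:

```python
# Minimal sketch, assuming the NAOqi Python SDK (Python 2.7) and a Nao on the
# local network. The IP address, animation names, and joke text are placeholders.
from naoqi import ALProxy

NAO_IP = "192.168.1.10"  # placeholder robot address

tts = ALProxy("ALTextToSpeech", NAO_IP, 9559)
animated = ALProxy("ALAnimatedSpeech", NAO_IP, 9559)

# Slow the voice down a little for a deadpan delivery (range 50-400, default 100).
tts.setParameter("speed", 85)

# ALAnimatedSpeech accepts text annotated with gesture animations and pauses,
# so timing and movement cues pulled from the human performance can be encoded inline.
joke = ("^start(animations/Stand/Gestures/Explain_1) "
        "Why did the robot cross the road? "
        "\\pau=800\\ "
        "Because it was programmed to. "
        "^wait(animations/Stand/Gestures/Explain_1)")

animated.say(joke)
```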

Read More

early thoughts on a framework

thinking about a spectrum of physicality triggered thoughts more like a web with interconnections, which turned into a multidimensional space with axes defining aspects of the framework and performance modalities positioned in 3D (or n-D) with interconnections…

Read More

Refining the research plan

Had a really great lunch discussion with Ehud today, and we went deep on what the overall dissertation research plan could be, and how the HRI course can serve those goals.

Coles Notes version – I’m going all-in on the “record a learning experience in 3D+ and explore playback in various modalities” idea. I think this is incredibly important. There is a lot of work going on looking at motion capture and performance, and a lot looking at playback (especially in games and movies), but there is a huge opportunity to explore recording a learning experience (an active learning classroom scenario) and allowing participants and observers to revisit it in various modalities.

Read More

3D session recording and playback ideas

Connecting some ideas and prototypes…

Imagine being able to capture a session in 3D (well, 4D), like this:

with maybe some 360˚ video, like this: (Google Chrome plays the video in immersive mode)

mix it with data from other sensors, audio, video, photos, media…

and play it back like this:

(3D playback was prototyped in Unity, using prototype primitives and stock models – but imagine that with fully modelled, lit, textured environments)
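To make the “mix it with data from other sensors” step a bit more concrete, here’s a minimal sketch of merging timestamped streams into one ordered timeline for synchronized playback – the stream names and data are made up, and the Unity prototype doesn’t necessarily work this way:

```python
# Minimal sketch: merging timestamped events from several capture streams
# (depth/mesh frames, audio markers, sensor readings...) into one ordered
# timeline for synchronized playback. Stream names and fields are hypothetical.
import heapq

def merged_timeline(streams):
    """Yield (timestamp, stream_name, payload) across all streams in time order.

    `streams` maps a stream name to a list of (timestamp, payload) tuples,
    each already sorted by timestamp.
    """
    tagged = (
        ((ts, name, payload) for ts, payload in events)
        for name, events in streams.items()
    )
    for ts, name, payload in heapq.merge(*tagged):
        yield ts, name, payload

# Example with made-up data: three streams captured during a session.
streams = {
    "depth_frames": [(0.00, "frame_0000.ply"), (0.04, "frame_0001.ply")],
    "audio_markers": [(0.02, "instructor starts speaking")],
    "tablet_screens": [(0.03, "slide_01.png")],
}

for ts, stream, payload in merged_timeline(streams):
    print("%6.2fs  %-14s %s" % (ts, stream, payload))
```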

Read More