Mixed Reality State as metaphor

Back in 2007 (and again in 2008), Hübler and Gintautas published a paper on an experiment that I think is fascinating: they coupled a real pendulum to a virtual pendulum and created what they called a mixed reality state.

You probably know what a real pendulum is and can imagine two of them connected somehow by a spring or a string in such a way that the two motions affect each other. OK, here is some help: despite Wikipedia, do not confuse two coupled pendulums

with a double pendulum

Now imagine simulating a pendulum on a computer using Newtonian mechanics and animating it subject to some initial conditions and/or a driving force. What Hübler and Gintautas did was physically couple the two types of pendulum, so that the virtual pendulum receives digital feedback about the position of the real one and the real pendulum receives physical feedback depending on the position of the virtual one. This is not a diagram of what they did, but merely an artist’s conception:
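The coupling is easier to see in code than in prose. Below is a toy model of my own (small-angle approximation, made-up parameters), not their actual apparatus or equations: two pendulums, one standing in for the real device and one for the simulation, each feeling a feedback term that depends on the other’s angle.

```python
import math

# Toy sketch of a mixed reality state: a "real" and a "virtual" pendulum,
# each receiving feedback that depends on the other's position.
# Illustrative only; not the Hübler/Gintautas experimental setup.

g, L = 9.81, 1.0          # gravity (m/s^2), pendulum length (m)
k = 0.5                   # illustrative coupling strength
dt, steps = 0.001, 20000  # time step (s), number of steps

theta_r, omega_r = 0.20, 0.0   # "real" pendulum: angle (rad), angular velocity
theta_v, omega_v = -0.05, 0.0  # "virtual" pendulum

for n in range(steps):
    # small-angle restoring force plus feedback from the other pendulum
    alpha_r = -(g / L) * theta_r + k * (theta_v - theta_r)
    alpha_v = -(g / L) * theta_v + k * (theta_r - theta_v)
    omega_r += alpha_r * dt
    omega_v += alpha_v * dt
    theta_r += omega_r * dt
    theta_v += omega_v * dt
    if n % 2000 == 0:
        t = n * dt
        print(f"t={t:5.2f}s  real={theta_r:+.3f} rad  virtual={theta_v:+.3f} rad")
```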

As far as I can tell, this work has not triggered a cascade of related research, but I like the idea as a metaphor for the use of educational technology, and even more broadly for the integration of technology into many aspects of our lives.

Blended learning refers to combining online and mobile computer technology with hands-on, peer-to-peer, and instructor-guided classwork. Ideally these modes are not independent but coupled. The student receives information and feedback from both physical and virtual learning environments, and both environments respond to the student’s changes. A mixed reality state.

Some critics have imagined online learning as a sad and lonely substitute for social learning in groups. But online social networks and discussion groups suggest other possibilities; in fact, they are becoming the subject of intense research on human dynamics. Most of us respond to feedback from physical and virtual sources every day through actions we take in the physical and virtual worlds. Forget the yellow submarine; we all live in a mixed reality state.

 

Apple’s disappointingly low-interaction iBooks (pending custom widgets, activity logging)

Apple held a much-anticipated education event today in NYC, and I’m mostly disappointed, though not without hope. If I were really cynical, I’d say they’ve made a deal with the big publishers that will benefit both of their businesses without really benefiting the end users. But I’m not that cynical. My gripe is with the potential for interactive engagement (buzzword alert) that might be lost on this new e-textbook platform, depending on the mysterious side door that is the custom widget. What interactive engagement means in the context of ebooks is perhaps unclear, so this is where I will start.

The Meaning of Interactive

To judge whether Apple’s new platform for electronic textbooks is really revolutionary on the interactivity front, we need to remember that the copy of Physics by Halliday & Resnick (3rd ed.) on my bookshelf is not exactly a hunk of basalt. It’s at least interactive in the following way: it has cover art and lots of different pages, and I can freely flip through them in any order I want. And it has pictures, or at least diagrams, which convey things to me that are hard to convey with text alone. Some of the text is bold or highlighted in some way, and I own the book, so I’m free to write notes in it if I want.

My iPad is way cooler than my copy of Halliday & Resnick; the color is nice and bright and I can have thousands of textbooks stored in it. But if all those textbooks do is flip pages, show pictures and take my notes, I would not say the interactivity has been substantially upgraded.

The first level of interaction is being able to see content that is richly dynamic and to change that content, albeit in a limited way. Watching videos and animations, zooming in and out of or rotating an image, turning on and off parts of a diagram or switching between multiple display modes–all of these are interactivity level one. It’s nice that these are part of the iBook experience, but it’s really the least you can do.

Interactivity 2.0 means being able to change things in a meaningful way, like moving a bunch of objects around and seeing what happens to their state as a result, or writing in your own equation and seeing it graphed as a mathematical curve. Interactive web graphics, apps and games have multiple knobs and gesture-based controls with exciting outcomes that can keep you busy for minutes or hours, not ten seconds spent flipping through modes 1 through 5.
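To make that second level concrete, here is a minimal sketch in Python (purely my own illustration; a real widget would be written in JavaScript, but the logic is the same) of “type in an equation, see the curve”: the reader’s expression is evaluated over a range of x and turned into points a figure could plot and redraw.

```python
import math

# Sketch of "type an equation, see the curve": evaluate a reader-supplied
# expression in x over a range and return (x, y) samples ready for plotting.
# Illustrative only; a real widget would parse input properly rather than eval it.

SAFE_NAMES = {name: getattr(math, name) for name in ("sin", "cos", "exp", "sqrt", "pi")}

def sample_curve(expression, x_min=0.0, x_max=10.0, n=50):
    points = []
    for i in range(n):
        x = x_min + (x_max - x_min) * i / (n - 1)
        y = eval(expression, {"__builtins__": {}}, dict(SAFE_NAMES, x=x))
        points.append((x, y))
    return points

# e.g. the reader types "sin(x) * exp(-x / 5)" and the figure redraws itself
for x, y in sample_curve("sin(x) * exp(-x / 5)", n=6):
    print(f"x={x:5.2f}  y={y:+.3f}")
```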

Here’s an example taken from life, not from the page: imagine you moved into my house and I showed you to your new room and said, look, you can do whatever you want in here; ahem, you can view your room from multiple angles, you can lie down in that bed there, and you can turn the lights on and off. That’s interactivity level 1, and not really all that generous. It’s a far cry from saying you can replace the carpet, move the furniture in, out and around, play music, cover the walls with faux fur, install your own disco lighting and change the window treatments. That’s Interactivity.

Of course the two levels I just sketched are arbitrary demarcations, because interactivity is really measured on a continuum that is a function of how rich a feedback experience you get from the interaction. And it seems to me that Apple’s current offering is rather limited.

I took a peek at the free sample of the only physics textbook in the new store. It’s a digital version of Glencoe Physics: Principles and Problems by Zitzewitz et al., which is actually a pretty good textbook; I even ordered it for my ninth-grade class when I was teaching full time. The free sample chapter is Vibrations and Waves, and it is full of pseudo-interactive figures like one on pendulum motion, which is really a three-stage figure with some transitional frames to “animate” it. Can you pull the pendulum and release it? No. Can you change the mass, the length of the rope, or the strength of gravity to see what effect each has on the period of oscillation? No. In fact this figure, like all of the interactive figures, is more frustrating than informative, because your natural tendency is to try to make things happen that you just can’t. (If, after reading this, you’re dying to play with a virtual pendulum, you can use the truly interactive PhET simulation.)
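For the record, here is the discovery the locked-down figure will not let you make: for small swings the period depends on the length and on gravity, but not on the mass. A few throwaway lines (the lengths and g values are just examples) make the point:

```python
import math

# Small-angle period of a simple pendulum: T = 2*pi*sqrt(L/g).
# Mass never enters the formula, which is exactly the kind of thing
# a draggable, adjustable figure would let a student discover.

def period(length_m, g=9.81):
    return 2 * math.pi * math.sqrt(length_m / g)

for L in (0.25, 1.0, 4.0):
    print(f"L = {L:4.2f} m  ->  T = {period(L):.2f} s")   # quadrupling L doubles T
print(f"On the Moon (g = 1.62): T = {period(1.0, g=1.62):.2f} s")
```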

I don’t believe this is just a shoddy job on McGraw-Hill’s part. The iBooks Author application provides interactivity through the following set of widgets: Gallery (multiple images you can flip through), Media (video), Review (multiple-choice-type questions), Keynote (for graphs and data, presumably), 3D (the image is rotatable with the touch interface) and Interactive Image. Interactive Image means you can select parts of the image which, when clicked on, will zoom the view and maybe show a bubble of text.

You’re not going to get past level-one interactivity with those tools. But here’s the rub: supposedly you can author your own widgets with JavaScript. I will need to see more, but this gives me hope for the next level of interactivity, because I’ve seen wonderfully interactive web graphics built with AJAX/JavaScript/jQuery. If iBooks Author can really incorporate that functionality, I may become a believer yet.

Not interactivity, but the next big thing in education

One feature that I doubt will be possible with iBooks 2 is activity logging, and that’s a shame. Activity logging is one way that educational web services learn what students really do given the opportunity, how much time they spend with different features, and, in the best possible integration of learning resources with formative assessment, how that interaction has affected their learning. This information is used to keep improving the resources in a feedback loop that makes things better for everyone, faster than the typical revision cycle. The data can also be useful to teachers, who can then supplement the offerings in ways the textbook cannot. Some people have privacy concerns about activity logging, but it can be done safely and anonymously; the bug reports your computer ships off to Apple or Microsoft (assuming you give your permission) are one simple case in point.
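As a sketch of how simple the anonymous version can be (the field names here are mine, not any vendor’s schema), an activity log is just a stream of timestamped events keyed to a random session ID rather than a name:

```python
import time
import uuid
from collections import Counter

# Sketch of anonymous activity logging: events carry a random session id,
# not a student's name, and can still answer "which features get used, and for how long?"

session_id = uuid.uuid4().hex  # random, not tied to the student's identity

def log_event(log, feature, action, seconds_spent):
    log.append({
        "session": session_id,
        "timestamp": time.time(),
        "feature": feature,        # e.g. "pendulum_figure"
        "action": action,          # e.g. "dragged_bob", "changed_length"
        "seconds": seconds_spent,
    })

log = []
log_event(log, "pendulum_figure", "dragged_bob", 12)
log_event(log, "pendulum_figure", "changed_length", 30)
log_event(log, "review_quiz", "answered_q3_correct", 45)

time_per_feature = Counter()
for event in log:
    time_per_feature[event["feature"]] += event["seconds"]
print(dict(time_per_feature))   # aggregate feedback for authors and teachers
```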

In sum, Apple’s small step falls well short of the potential for interactive online educational tools, but it safeguards a certain proprietary model for publishing, which makes it palatable to the educational behemoths. That’s a sound business move. If custom widgets open the door to real interactivity, then the platform will also be a boon for students and independent learners. But unless it grows to include activity logging and meaningful assessment, this new platform will soon be supplanted by higher-quality web-based services that are just as accessible from mobile devices and even more useful for students and teachers.

The Holodeck is the Ideal Classroom. The wallodeck is closer to reality.

I had never heard of this guy before, but he beat me to the point I want to make today, which is that the holodeck is the platonic ideal of the classroom. In the holodeck, anything is possible*, you can experience virtual reality with all of your senses, it is completely safe to experiment, you can pause, slow down or rewind your experience, and there is a nearly omniscient computer available to answer any question you might have (but it will only do so if you ask or get really badly stuck). You can be in the holodeck alone or with real or virtual companions. The holodeck is so amazing that I can see why it would coexist with a war-less future.

* Although the interaction is rule-based, the rules can be different from the usual ones.

Is Microsoft building one? Maybe. Is the CAVE sort of like it?

I’m not sure the 3D-realistic-haptic experience is that close on the horizon. Short of the real deal, here is how I envision a physics education wallodeck (2D and not haptic) within the next 5-10 years with the aid of artificial intelligence. It would take place on a surface (just don’t say SMART board) and enable you to:

  • draw objects or write text, and it would recognize objects, text labels, and equations. Examples of research on this are the now-defunct Microsoft center at Brown U. and pen-based interactions at ISUE by Joe LaViola (a graduate of the Brown center). Try this fun handwriting-to-LaTeX converter, if that means anything to you. I am deliberately glossing over the issue of representing 3D objects in 2D for now.
  • assign rules to the objects you create, such as: rigid body, interacts with gravity, objects that collide with it lose 80% of their energy, has no surface friction, slightly magnetic, etc., so that you can “turn on time” and animate anything (see the toy sketch after this list). Commercial solid modelers like SolidWorks and Autodesk Inventor already do this. Arriving HD for the iPad uses this kind of thing in a game environment. And PhET simulations do it for objects that are predetermined and physics-topical. NetLogo is an amazing tool for writing code to model very simple dynamical systems.
  • describe the system with equation models and the computer will correct you if you make a mistake (assuming you want this feature on). Solve for unknowns as physics practice. (cf. Andes tutor and other Intelligent Tutoring Systems or ITS)
  • create problems, share them, or solve problems created by anyone (cf. LONCAPA) or generated on the fly by the wallodeck program, obviating the need to withhold problems from publication in order to prevent copying on graded assignments.
  • know your skill level, what the holes in your knowledge are, how you stack up, and what the best next moves are to help you improve. This is where more research in machine learning, educational data mining, cognitive science, applied psychological measurement, and ITS fits in. The problem of assessment still looms large.

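The toy sketch promised above: a crude version of “assign rules to the objects you create”, with a single object that interacts with gravity and loses 80% of its kinetic energy in every collision with the floor. All numbers are illustrative.

```python
# Toy version of "assign rules to the objects you create": a ball that
# interacts with gravity and loses 80% of its kinetic energy at each bounce.
# All numbers are illustrative.

g = 9.81            # m/s^2
dt = 0.001          # time step (s)
energy_kept = 0.20  # each collision keeps 20% of the kinetic energy

y, v = 2.0, 0.0     # height (m), velocity (m/s)
t, bounces = 0.0, 0

while bounces < 5:
    v -= g * dt
    y += v * dt
    if y <= 0.0 and v < 0.0:           # hit the floor
        y = 0.0
        v = -v * energy_kept ** 0.5    # losing 80% of energy scales speed by sqrt(0.2)
        bounces += 1
        print(f"bounce {bounces} at t={t:.2f}s, rebound speed {v:.2f} m/s")
    t += dt
```
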
If you are an animator and are interested in animating this wallodeck concept collaboratively, get in touch with me!

Two rules of data mining

Why no posts? I’m actually working on a paper called Item Response Theory in the Style of Collaborative Filtering. I wanted to write all about it here, but then I thought that might cause trouble before I’ve finished at least a draft of the paper. Soon. In the meantime, I offer you this:

The Two Rules of Data Mining:
1) Less is more.