Sydewynder is an open-source SMS receiver and sender application written in Python for Nokia S60 phones. It can automate responses to messages and can be used as a mobile application server in areas where setting up a traditional server may be difficult or illegal. It is also very useful for prototyping mobile applications, such as games, without the burden of expensive hosting. As such, it works very well in educational settings. It even includes an emulator for developing scripts off the phone.
Sydewynder requires the latest version of Python for S60 (>= 1.4.0). If you are using an earlier version of PyS60, please update your phone with the latest software.
To install Sydewynder, copy the contents inside the "sydewynder-x.x" directory into the E:\Python\ directory on your S60 phone (this should be the memory card). Sydewynder comes with "Pig Latin", "Ask Tom Cruise", and "We Feel Fine" as example scripts, as well as arcade.py, which will run all of them from a single server instance. Feel free to look at how the files are constructed and modify them for your own purposes. Pay special attention to the comments, as they will make developing new apps for Sydewynder much easier.
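The Pig Latin example, for instance, is at heart a simple text transform applied to each incoming message before it's sent back. Here's a minimal sketch of that transform; the function name and the exact translation rules are my own illustration, not Sydewynder's actual code:

```python
def pig_latin(message):
    """Translate each word of an incoming SMS into Pig Latin.

    Words starting with a vowel get "way" appended; otherwise the
    leading consonant cluster moves to the end, followed by "ay".
    """
    vowels = "aeiou"
    words = []
    for word in message.split():
        lower = word.lower()
        if lower[0] in vowels:
            words.append(lower + "way")
        else:
            # Find the first vowel and rotate the consonants before it
            for i, ch in enumerate(lower):
                if ch in vowels:
                    words.append(lower[i:] + lower[:i] + "ay")
                    break
            else:
                words.append(lower + "ay")  # no vowels at all

    return " ".join(words)

print(pig_latin("hello there"))  # -> "ellohay erethay"
```

In the real script, a function like this would be wired up as the reply handler, so whatever text a participant sends comes back translated.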
Scripts developed off-phone can be run like any other Python script if the syde_emu.py module is in the same directory. When you run your script from the command line, a crude emulator will appear and guide you through a typical interaction between cell phone users and the Sydewynder app you have created.
This project was featured on the front page of the CDT department website.
It was also part of Paul Notzold's Ask Tom piece, which was on display at the Chelsea Art Museum for the Parsons 10 Years Running show. Participants were asked to text a question to a number and received back a random (or is it?!) quote from a famous celebrity whose name rhymes with Bomb Booze. The phone "server" running Sydewynder stayed up for about two weeks straight without much of a problem.
If you're using Sydewynder, be sure to drop a comment on the Sourceforge forum and let us know what you've done with it. And be sure to post there if you need help or run into any bugs.
Sydewynder is copyright 2007 Mike Edwards and is licensed to you under the GPL version 2.0. "Ask Tom" was developed with Paul Notzold. "We Feel Fine" uses the amazing wefeelfine.org API to work its magic.
Since I'm still on winter break, I took the better part of today to mess around with my new (well, secondhand) Kinect. I used the excellent freenect library and its Python bindings to whip up this simple little demo.
There's a lot here that is very similar to how we work in SMALLab, especially how we deal with a "buttonless" interface. Using the Z-axis (forward and back on the Kinect, up and down in SMALLab) makes for a pretty nice way of changing from active to inactive state.
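The core of that buttonless toggle is just a depth threshold, ideally with a bit of hysteresis so jitter near the boundary doesn't flicker the state. A sketch of the idea (class name and the depth values are illustrative, not the actual numbers from the demo):

```python
class ZToggle:
    """Toggle between active and inactive based on depth readings.

    Uses two thresholds (hysteresis): you must come closer than
    `activate_at` to turn on, and retreat past `release_at` to turn
    off, so sensor noise near one boundary can't flip the state
    back and forth.
    """

    def __init__(self, activate_at=600, release_at=700):
        self.activate_at = activate_at  # mm; closer than this -> active
        self.release_at = release_at    # mm; farther than this -> inactive
        self.active = False

    def update(self, depth_mm):
        if depth_mm < self.activate_at:
            self.active = True
        elif depth_mm > self.release_at:
            self.active = False
        # Readings between the two thresholds keep the current state.
        return self.active
```

Feeding each frame's depth reading for the tracked hand into `update()` gives a stable on/off signal without any physical button.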
I'm looking forward to playing with this some more in the future. It's not the most responsive system in the world, but I can see simple and casual uses for this kind of interaction in schools, public displays, etc.
If you picked up a copy of an actual newspaper this weekend (I had to make a special trip for it), you may have seen the cover of the New York Times Magazine with an odd pie chart on it. This is SMALLab, one of the research projects that I've been working with for a long time alongside other Parsons faculty and students.
I thought I'd add a small contribution by breaking down what's on the cover here for posterity. In the top right, Kees is holding one of the mocap controllers that the students made in the classroom at the beginning of last year. It's a big Styrofoam ball with four chopsticks poked into it. Each chopstick is topped with a retroreflective ball, which is what we use to track the position of the controller when we shine infrared light on it from each of the twelve cameras in our OptiTrack system.
We loved these controllers so much that, instead of being temporary tools for teaching about the system, we've kept them for over a year now and have even moved them into Quest's new location with us. If you look very closely, you can see the first name of Claudio Midolo, another of SMALLab's researchers who helped design the controller-building exercise (and helped put this one together--hence deserving his name on it.)
In the bottom right corner is an example of one of the custom controllers we've made to handle a variety of scenarios in SMALLab. This one is a paddle I designed and built for the Raft scenario. It's one of a series of controllers that Claudio and I made, which also included forms that looked like pumps and metal detectors.
The whole projection image is something Kyle Li put together. It's not actually a real scenario--he built a fully functioning one that is much more interesting to look at (as well as being playable and fun.) But the Times had this gigantic photography rig over the SMALLab mats on the day they shot, so the motion tracking wasn't going to do much of anything with that hulk in the way.
Anyway, that's just a bit of the back story behind the image you see there. We're continuing to develop things with all new controllers and additions to the technology, in many new and exciting subjects, so expect our next SMALLab cover model to be even cooler.
Here is the latest in my continuing series on analyzing Twitter conference backchannels by their hashtags and replies/retweets. This one, though, is a bit different and special... because I was actually at the conference! Below is my breakdown of Games + Learning + Society 2009 via the #gls and #gls09 hashtags.
The positions of the glowballs, and other information coming from SCREM, can be relayed to a little Nokia N800 palmtop over the WiFi connection.
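SCREM's actual wire format isn't documented here, but the relay itself can be as simple as serializing each frame's positions and pushing them over UDP to the handheld. A hypothetical sketch, with an invented JSON payload format:

```python
import json
import socket

def send_positions(sock, addr, positions):
    """Serialize glowball positions as JSON and send them over UDP.

    `positions` is a list of (id, x, y, z) tuples. The JSON format
    here is invented for illustration -- SCREM's real protocol may
    look quite different.
    """
    payload = json.dumps([
        {"id": i, "x": x, "y": y, "z": z} for (i, x, y, z) in positions
    ]).encode("utf-8")
    sock.sendto(payload, addr)

# Demo on localhost: the N800 side would run a receiver like this.
recv = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
recv.bind(("127.0.0.1", 0))  # OS picks a free port

send_sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
send_positions(send_sock, recv.getsockname(), [(1, 0.2, 0.5, 1.1)])

data, _ = recv.recvfrom(4096)
print(json.loads(data)[0]["id"])  # -> 1
```

UDP suits this kind of stream well: a dropped frame just gets superseded by the next one, so there's no point paying TCP's retransmission cost.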
Tracking is still a little rough, and, as I mention in the video, I think the IR cams are grabbing the N800 a little bit, so we'll need to keep it outside of the area--at least until the tracker is a bit more solid. Still, it should be a handy way to bring in another interface--and other participants--inside SMALLab. In fact, I gave the N800 to James in the office, and he could watch the movements of the balls through the wall.
I'm sure that's useful somehow.
Here is a quick video explanation of the HitArea and DragArea render engines.
The engines can track when a pointer enters and leaves them, and the DragArea engine also reports back the pointer's relative position--handy for sliders, etc.
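The logic boils down to a point-in-rectangle test plus a bit of state to notice transitions. A minimal sketch of how a DragArea-style engine could work; the class and method names here are my own illustration, not SMALLab's API:

```python
class DragArea:
    """Axis-aligned rectangle that reports pointer enter/leave events
    and, while the pointer is inside, its relative (0..1) position.
    """

    def __init__(self, x, y, width, height):
        self.x, self.y = x, y
        self.width, self.height = width, height
        self.inside = False

    def update(self, px, py):
        """Return (event, rel) where event is 'enter', 'leave', or None,
        and rel is the (0..1, 0..1) relative position or None."""
        hit = (self.x <= px <= self.x + self.width and
               self.y <= py <= self.y + self.height)
        event = None
        if hit and not self.inside:
            event = "enter"
        elif not hit and self.inside:
            event = "leave"
        self.inside = hit
        if hit:
            rel = ((px - self.x) / self.width,
                   (py - self.y) / self.height)
        else:
            rel = None
        return event, rel
```

The relative position is what makes sliders easy: a wide, short DragArea's `rel[0]` maps directly to a 0-to-1 slider value.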
Here's more work on faking three dimensions on a flat plane.
Again, this only works from the perspective of the ball, so one person needs to hold it fairly near their point of view to get the effect. And, from the brief demo we did at the workshop, sometimes people need a few minutes to adjust to the effect before they feel like they're navigating space.
At this point, it seems like more of a fun demo or a parlor trick. One really good suggestion we got recently, though, was that we could give participants a button, say on a Wiimote, that would allow them to assume the first person perspective if more than one person is navigating the space. That might help share the experience a bit better.

We were experimenting with different ideas for how to represent the third dimension you have access to with the SMALLab installation. This is a trial using a camera viewpoint that tracks the position of the ball as you move through a space with it. It fools you into thinking that a 3d world is changing at your feet.
It's not a perfect solution. It only works for one person, and there are limits to how high you can pretend the Z-axis goes. Still, it's an interesting way of looking at the problem, and it might open more doors later on.
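The demo's actual projection math isn't spelled out here, but one way to fake depth on the floor is with similar triangles: draw each virtual object at the spot where the line from the viewer's eye through the object's top would hit the floor, so taller objects shift more as you move. A sketch under that assumption (purely illustrative geometry, not the actual SMALLab code):

```python
def project_point(obj_x, obj_y, obj_h, viewer_x, viewer_y, viewer_h):
    """Project a virtual object of height obj_h onto the floor plane,
    as seen by a viewer whose tracked eye point is at
    (viewer_x, viewer_y) with height viewer_h.

    An object at floor level (obj_h == 0) stays put; taller objects
    slide away from the viewer as the viewer approaches, which is
    what sells the parallax illusion from that one viewpoint.
    """
    if obj_h >= viewer_h:
        raise ValueError("object must be below eye level for this trick")
    # Similar triangles: the ray from the eye through the object's top
    # reaches the floor scaled out from the viewer's position.
    scale = viewer_h / (viewer_h - obj_h)
    floor_x = viewer_x + (obj_x - viewer_x) * scale
    floor_y = viewer_y + (obj_y - viewer_y) * scale
    return floor_x, floor_y

# Viewer at the origin with a 2 m eye height; a 1 m tall object one
# meter away gets drawn two meters out on the floor.
print(project_point(1, 0, 1, 0, 0, 2))  # -> (2.0, 0.0)
```

The `viewer_h` ceiling in the formula is also exactly why there's a limit to how high the Z-axis can pretend to go: as an object's height approaches eye level, its floor position runs off to infinity.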
Copyright Mike Edwards 2006-2009. All content available under the Creative Commons Attribution ShareAlike license, unless otherwise noted.