"Next year, people will share twice as much information as this year"


Affectionately termed 'Zuck's Law', this quote from a 'young' Mark Zuckerberg was presented to us during a keynote from Facebook's Mark Tonkelowitz, after a thrilling day at the Human-Computer Interaction Design Open Day.

The day was free to sign up for and well worth the trip, with some familiar faces from the London HCI and UX scenes also in attendance.

There were many great presentations on the day, but I'd like to focus on two in this post.

Bananas


The first, from Adrian Westaway, entitled "Bananas, Technology and Magic", highlighted how informed research and open thinking can lead to extraordinary solutions.

In his presentation, Adrian took us through an award-winning project designed to create a mobile phone product for over-60s.

The key insight was not to focus on disabilities and health issues (as some phone manufacturers have done with their big red SOS buttons), but on the vast majority of this age group who have no physical or mental conditions preventing them from using mobile phones, yet who grew up in an analogue world and have a natural fear of all things digital.

The resulting solution was not a new mobile phone, but three magical ideas that cleverly remove the fear of unboxing a digital product, reduce the complexity of the user manual, reveal the hidden functionality, and enhance the ongoing use of today's mobile phones.

The presentation:



Idea 1: Unboxing:
http://www.youtube.com/watch?v=cYfSKGjHBKg

Full Article:
http://www.hhc.rca.ac.uk/2261-2270/all/1/Out_of_the_Box.aspx#


The Lab




The second presentation, from the HCID team, was a tour of the Interaction Lab.

Being a tech geek, I found this the most exciting part, and were it not for Adrian's remarkable story (above), it would have been the stand-out moment of the day.

In each corner of the lab was a piece of technology I had only previously seen from afar – the Microsoft Surface (the table, not the new tablet), a 3D printer, eye-tracking machines, brain-tracking machines (!), and of course the Microsoft Kinect.

Then there was some more unusual tech aimed at digitising collaboration within product teams: 

A ceiling-mounted projector which beams an image onto a horizontal whiteboard and scans the drawings people make.

A table-mounted tracker detecting your pen movements with sound-waves – a potentially useful (and portable) evolution for collaborative design.

It was all very experimental, all very nifty. But I found myself questioning whether this tech is ready for the mass market.

After a quick pose in front of the Kinect, it was on to the demonstrations.

Eye-tracking


As an advocate for UX research methods, I was keen to experience the eye-tracking system and find out how it feels to take part. So when the call came for a volunteer from the 20-strong group, I duly snapped up the chance.

The first step was to calibrate the scanner so it could accurately detect my eye movements. After a few seconds looking at various dots on the screen, it decided that my right eye was not as good as my left, but otherwise we were good to go.

The task was to find the date and time of the Olympics Basketball final from the homepage of the London 2012 website.

As I started, I became very aware that for the first time in my life, other people could see exactly what I was looking at. It's a strange feeling and I was determined to only look at the right spots. This almost certainly skewed the results (in my favour), but I suspect after a longer period I would have become more relaxed.

Visualisation




Naturally though, and despite my high expectations for myself, my speed and accuracy were around average for that test.

Of course, the point of the test is not speed; it is to see where I naturally expect to find things on a page, and where I looked before making my decision to click.

Although I was certain I had gone straight through with ease, the eye-path visualisations it churned out told a different story entirely.

It's fascinating stuff, and after a number of similar tests with different people, I could see how useful the build-up of data would be.

I recommend seeking out one of these machines and trying it for yourself (at a show or event, or even in solving some of your own design challenges).

It was by far the most practical piece of kit in the Interaction Lab. It really shows just how much goes into those endless Nielsen Norman reports, and reveals how much there is to know about our thought processes when navigating the web.

Back to Facebook


The keynotes in the evening (from Mark Tonkelowitz – Facebook, Matthew Cockerill – Seymourpowell, Juho Parviainen – IDEO) were all fascinating too. It's well worth catching these guys if they are doing an event near you.

I will leave you with a final point from Mark Tonkelowitz:

At Facebook they have a lot of users. This means they can roll out new features, iterative updates and A/B tests to a tiny fraction of their user base and still reach millions of users (if only we were all so lucky).
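
For the curious: a tiny-percentage rollout like this is commonly done by hashing each user into a stable bucket and switching the feature on only for the first few buckets. Here's a minimal Python sketch of the idea; it's purely illustrative, not Facebook's actual system, and the function and feature names are made up.

# Purely illustrative sketch (not Facebook's real system): hash each user
# into a stable bucket, and enable the feature only for the first N% of buckets.
import hashlib

def in_rollout(user_id: str, feature: str, percent: float) -> bool:
    # Mixing the feature name into the hash gives each feature its own,
    # but stable, bucketing of users.
    digest = hashlib.sha256(f"{feature}:{user_id}".encode()).hexdigest()
    bucket = int(digest, 16) % 10000       # stable bucket in 0..9999
    return bucket < percent * 100          # e.g. percent=1.0 -> buckets 0..99

# With a billion users, even a 1% rollout reaches around 10 million people.
print(in_rollout("user-12345", "new-news-ticker", percent=1.0))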

To decide who to use for these tests, they look at countries whose people tend to have the fewest connections outside their country. The people in these countries then become guinea pigs for new updates.

Care to guess which countries they are?

Venezuela and New Zealand. Elvia and Rosie, that explains a lot.