Tuesday, August 2, 2011

Next Generation of User Interactions - Microsoft Surface

User Interaction History

In the early days of computing we worked with command-line screens, which required comprehensive knowledge of the commands and forced us to remember them exactly as they were specified in the user guides. Later the GUI was introduced, and it took the user experience to a whole new level compared with those earlier days. Now it's time to move from Graphical User Interfaces to Natural User Interfaces, which are about responding to users' natural actions. Microsoft Surface and Kinect are all about providing the capabilities to develop applications for the next generation, meaning applications with Natural User Interfaces.

Microsoft Surface

It's about experiencing a 360-degree interface: with PixelSense, Microsoft Surface sees and responds to real-world objects, handling more than 50 simultaneous inputs. This is done with the new 40-inch Samsung SUR40 for Microsoft Surface, which can be used as a table, mounted on a wall, or embedded in fixtures or furniture.

Microsoft Surface Home Page

What is this PixelSense?

PixelSense allows a display to recognize fingers, hands, and other objects placed on the screen, providing vision-based interaction without the use of cameras. This is done by each pixel, which sees what is touching the screen, picks it up, and passes it on for processing.

It's very much like what happens in our eyes: the eye picks up the image and transfers it to the brain for recognition and further processing.


How PixelSense works in detail (a rough sketch of the last few steps follows the list):

  1. We touch the front screen (with a finger, a piece of paper, or any other object)
  2. An IR backlight unit shines light through the optical sheets, LCD, and protection glass, and that light hits the contacts, i.e. whatever is touching the front screen
  3. Light reflected back from a contact is seen by the integrated sensors
  4. The sensors convert the light signal into an electrical signal/value
  5. The values reported by all the sensors are used to form a picture of the real objects placed on the screen
  6. The picture is analyzed using image-processing techniques
  7. The output is sent to the PC; it includes the corrected sensor image and the various contact types (fingers, blobs, and tags)
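To make steps 5 to 7 a little more concrete, here is a tiny, purely illustrative C# sketch of turning a grid of sensor values into contact "blobs". The threshold, the fake sensor grid, and the simple flood-fill labelling are all my own simplifications for illustration; the real vision system is far more sophisticated.

```csharp
using System;
using System.Collections.Generic;

// Illustrative only: a toy version of steps 5-7, "sensor values -> contacts".
class SensorImageDemo
{
    const int Threshold = 128;   // assumed cut-off between "touched" and "not touched"

    static void Main()
    {
        // A tiny fake sensor image (0 = no reflection, 255 = strong reflection).
        int[,] sensor =
        {
            { 0,   0,   0,   0,   0 },
            { 0, 200, 210,   0,   0 },
            { 0, 190, 220,   0, 180 },
            { 0,   0,   0,   0, 175 },
        };

        int rows = sensor.GetLength(0), cols = sensor.GetLength(1);
        bool[,] visited = new bool[rows, cols];

        for (int r = 0; r < rows; r++)
        {
            for (int c = 0; c < cols; c++)
            {
                if (sensor[r, c] < Threshold || visited[r, c]) continue;

                // Flood-fill one connected region of bright pixels = one contact.
                List<int[]> blob = new List<int[]>();
                Stack<int[]> stack = new Stack<int[]>();
                stack.Push(new[] { r, c });
                visited[r, c] = true;

                while (stack.Count > 0)
                {
                    int[] p = stack.Pop();
                    blob.Add(p);
                    int[][] neighbours =
                    {
                        new[] { p[0] - 1, p[1] }, new[] { p[0] + 1, p[1] },
                        new[] { p[0], p[1] - 1 }, new[] { p[0], p[1] + 1 }
                    };
                    foreach (int[] n in neighbours)
                    {
                        if (n[0] >= 0 && n[0] < rows && n[1] >= 0 && n[1] < cols &&
                            sensor[n[0], n[1]] >= Threshold && !visited[n[0], n[1]])
                        {
                            visited[n[0], n[1]] = true;
                            stack.Push(n);
                        }
                    }
                }

                // Report a crude "contact": pixel count and centre, like step 7's output.
                double rowSum = 0, colSum = 0;
                foreach (int[] q in blob) { rowSum += q[0]; colSum += q[1]; }
                Console.WriteLine("Contact: {0} pixels, centre ({1:0.0}, {2:0.0})",
                                  blob.Count, rowSum / blob.Count, colSum / blob.Count);
            }
        }
    }
}
```

Running this on the fake grid reports two contacts, the large blob on the left and the smaller one on the right, which is roughly the kind of per-contact summary the PC receives in step 7.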

What we can do with this as Software developers

It's about developing applications that let our customers experience this amazing technology. Microsoft provides an SDK called the Microsoft Surface SDK; its first version was released a while ago, and the latest release, Surface SDK 2.0, is available for download on the Microsoft web site. You will find more developer and designer resources on the Surface Design and Development site.

The new Surface SDK builds upon:

  1. Embedded Windows 7 Professional 64-bit
  2. .NET Framework 4.0
  3. Windows Presentation Foundation 4.0
  4. Microsoft XNA Framework 4.0 – used for game development with Microsoft XNA Game Studio 4.0
  5. Windows PowerShell and DMTF DASH support, and enhanced administrator tools

Developers can do basic testing of their Surface applications on touch-enabled computers running 64-bit Windows 7. The Surface SDK will be replaced by the Surface Toolkit for Windows Touch (beta) later this year, which means everything will eventually be delivered with Windows installations (most probably via the Windows SDK).
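For that kind of basic test you don't even need the Surface SDK: plain WPF 4 touch events on a Windows 7 touch-enabled machine are enough to confirm that multi-touch input reaches your application. Here is a minimal sketch using only standard WPF 4 APIs (nothing in it is Surface-specific):

```csharp
using System;
using System.Windows;
using System.Windows.Input;

// Minimal WPF 4 multi-touch check: run it on a Windows 7 touch-enabled PC
// and the window title reports how many fingers are currently down.
class TouchTestApp
{
    [STAThread]
    static void Main()
    {
        Window window = new Window { Title = "Touch test", Width = 800, Height = 600 };
        int activeTouches = 0;

        window.TouchDown += (sender, e) =>
        {
            activeTouches++;
            Point p = e.GetTouchPoint(window).Position;
            window.Title = string.Format("Touches: {0}, last down at ({1:0}, {2:0})",
                                         activeTouches, p.X, p.Y);
            e.Handled = true;
        };

        window.TouchUp += (sender, e) =>
        {
            if (activeTouches > 0) activeTouches--;
            window.Title = string.Format("Touches: {0}", activeTouches);
            e.Handled = true;
        };

        new Application().Run(window);
    }
}
```

If the title bar never changes when you touch the screen, Windows is not seeing the machine's touch digitizer, and Surface application testing on it won't work either.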

UI designers can design Surface applications with Microsoft Expression Blend 4 - http://msdn.microsoft.com/en-us/library/3a404b36-d8a4-4416-a4d4-1b9982596fe4#Blenderoo

Here you can find training for Surface developers - http://www.microsoft.com/surface/training20/


Surface Architecture

This is the interesting part of Surface: what the Microsoft Surface architecture looks like.

Here is a description of each layer:


Windows 7

Surface runs on the Windows 7 operating system. Windows 7 provides all the administrative, security, directory, and other Windows functionality (Bluetooth, WiFi, and so on) of the device made for Surface.

Developers and administrators who are working on a device made for Surface have full access to Windows functionality (in Windows mode). However, when users interact with Surface applications on a device made for Surface, the Windows user interface is completely suppressed (in Surface mode).

Hardware

The hardware of a device made for Surface includes a high definition, touch-sensitive display surface, and a computer that is running Windows 7. The hardware can capture physical points of touch, on or close to the screen, at 60 frames per second.

Vision System

The vision system software processes the video data that the hardware captures, and converts the raw video into data that you can access through Surface SDK APIs.

On devices made for Surface, a combination of a dedicated processor and touch-sensing technology sends data to your application through the Presentation layer and Core layer APIs.

Presentation and Core Layers

The Surface Software Development Kit (SDK) informs applications when touch points appear on the interactive surface device over the application window. As users touch the interactive surface and move around, the Surface SDK notifies applications so that applications can update their user interfaces.

For each touch point, applications can determine the position, orientation, bounding box, and central ellipse. For touch points that are made with tagged objects (which have tags printed on the bottom of the objects), applications can also determine the tag value.
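As a rough illustration of how a Presentation-layer (WPF) application reads that per-contact data: the position and bounding box come straight from WPF 4's TouchPoint, while the orientation and tag information come from Surface SDK extension methods on TouchDevice. I'm quoting the extension-method and type names (GetOrientation, GetIsTagRecognized, GetTagData, TagData) and the Microsoft.Surface.Presentation.Input namespace from memory, so treat them as assumptions and confirm against the SDK documentation.

```csharp
using System;
using System.Windows;
using System.Windows.Input;
using Microsoft.Surface.Presentation.Input;   // assumed namespace for the Surface touch extensions

// Sketch of reading per-contact data in a Presentation-layer (WPF) app.
class ContactInfoApp
{
    [STAThread]
    static void Main()
    {
        Window window = new Window { Title = "Contact info", Width = 800, Height = 600 };

        window.TouchDown += (sender, e) =>
        {
            // Position and bounding box are plain WPF 4 TouchPoint data.
            TouchPoint tp = e.GetTouchPoint(window);
            Point position = tp.Position;
            Rect bounds = tp.Bounds;

            // Assumed Surface SDK extension methods on TouchDevice:
            double orientation = e.TouchDevice.GetOrientation(window);

            if (e.TouchDevice.GetIsTagRecognized())
            {
                // A tagged object: read the value printed on its tag.
                TagData tag = e.TouchDevice.GetTagData();
                window.Title = string.Format("Tag {0} at ({1:0}, {2:0})",
                                             tag.Value, position.X, position.Y);
            }
            else
            {
                window.Title = string.Format(
                    "Touch at ({0:0}, {1:0}), size {2:0}x{3:0}, orientation {4:0} deg",
                    position.X, position.Y, bounds.Width, bounds.Height, orientation);
            }
        };

        new Application().Run(window);
    }
}
```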

The Surface SDK exposes two sets of APIs: the Presentation layer and the Core layer. You can use only one layer when you are developing a Surface application:

  • The Presentation layer integrates with Windows Presentation Foundation (WPF) and includes a suite of Surface-enabled controls.
  • You can use the Core layer with almost any user interface framework, including XNA 4.0.

For more information about the Presentation and Core layers, see Presentation Layer vs. Core Layer Applications.
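As an example of the Presentation layer, a minimal application can be little more than a SurfaceWindow hosting a ScatterView, whose items can be dragged, rotated, and resized with fingers out of the box. The sketch below builds the UI in code rather than XAML just to keep it in one file; SurfaceWindow, ScatterView, and ScatterViewItem are the control names in Microsoft.Surface.Presentation.Controls as I remember them, so double-check them against the SDK.

```csharp
using System;
using System.Windows;
using Microsoft.Surface.Presentation.Controls;   // Surface SDK 2.0 controls (names assumed from memory)

// A minimal Presentation-layer app: a SurfaceWindow hosting a ScatterView.
// ScatterViewItems get touch move/rotate/scale behaviour for free.
class ScatterDemoApp
{
    [STAThread]
    static void Main()
    {
        ScatterView scatter = new ScatterView();

        // Add a few simple items; any UIElement (images, video, controls) can be content too.
        for (int i = 1; i <= 3; i++)
        {
            scatter.Items.Add(new ScatterViewItem
            {
                Content = "Item " + i,
                Width = 200,
                Height = 120
            });
        }

        SurfaceWindow window = new SurfaceWindow
        {
            Title = "ScatterView demo",
            Content = scatter
        };

        new Application().Run(window);
    }
}
```

A Core-layer application would skip these controls entirely and read the raw contact data itself, for example to feed an XNA 4.0 rendering loop.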

Surface Shell

Surface Shell is the component that manages applications, windows, orientation, and user sessions; it also provides other functionality. Every Surface application must integrate with Surface Shell.

Surface and Windows Integration

The integration between Surface and the Windows operating system provides system-wide functionality on top of the Windows operating system. You must use this functionality to support unique aspects of the Surface experience, such as managing user sessions, switching between the Windows user interface (Windows mode) and the user experience (Surface mode), monitoring critical Surface processes, and handling critical failures.


Surface Overview - http://msdn.microsoft.com/en-us/library/ff727864.aspx

Windows Surface Blog - http://blogs.msdn.com/b/surface/

Applications - http://www.microsoft.com/surface/en/us/applicationpages.aspx

Forums - http://social.msdn.microsoft.com/Forums/en/surfaceappdevelopment/threads

There are a number of videos you can enjoy from Microsoft:

Experience things in new way - http://www.microsoft.com/surface/en/us/whatissurface.aspx

Power of PixelSense - http://www.microsoft.com/surface/en/us/pixelsense.aspx

Case Studies - http://www.microsoft.com/surface/en/us/casestudies.aspx

Lakmal Kankanamge
