A Platform for Location Based Augmented Reality Applications
ADITYA INSTITUTE OF TECHNOLOGY AND MANAGEMENT …..PRESENTED BY…..
S. Sai Sateesh, III/IV B.Tech (E.C.E), AITAM, Tekkali.
[email protected]
N.G.R. Reddy, III/IV B.Tech (E.C.E), AITAM, Tekkali.
[email protected]
Abstract:
Augmented Reality (AR), a user interface technique enhancing the user's perception of the real world with computer generated entities, and mobile computing, allowing users to access and manipulate information anytime, independent of location, are two emerging interface technologies that show great promise. The combination of both into a single system makes the power of computer enhanced interaction and communication in the real world accessible anytime and everywhere. This paper describes our work to build a mobile Augmented Reality system that supports true stereoscopic 3D graphics, a pen and pad interface and direct interaction with virtual objects. The system is assembled from off-the-shelf hardware components and serves as a basic test bed for user interface experiments related to computer supported collaborative work in Augmented Reality. It also describes some applications we are developing in the area of location based computing.

Introduction and related work:
Augmented Reality is a powerful user interface paradigm allowing users to interact with computers in a natural way: the user's view of the real world is annotated with computer generated entities. Mobilizing such an interface by deploying wearable computers is a logical extension, as the body of related research shows. Wearable computing allows the user to access computer resources at any location and at any time. AR is often used as an interface technique in wearable computing because it provides an information space which is continuously and transparently accessible. Information can be accessed hands-free, and the user is not interrupted, a requirement for continuous use. If these technologies are combined with position tracking, location aware applications are possible: the system changes its behavior based on the user's location, without the user's intervention. An impressive demonstrator for location aware mobile AR using both a head-mounted and a hand-held display is Columbia's Touring Machine [3], which was used to create a campus information system and situated documentaries [4].

AR setup:
While computational power for stereoscopic rendering and computer vision is becoming available in mobile computer systems, the size and weight of such systems is still not optimal. Nevertheless, our setup is built solely from off-the-shelf hardware components to avoid the effort and time required for building our own. On the one hand this allows us to quickly upgrade old devices or add new ones and to change the configuration easily. On the other hand we do not obtain the smallest and lightest system possible.

Hardware:
The most powerful portable graphics solution currently available is a PC notebook equipped with an NVidia GeForce2Go video chip. The device has a 1 GHz processor and runs under Windows 2000. We also added a wireless LAN network adapter to enable communication with our stationary setup or a second mobile setup in the future. The notebook is carried by the user in a backpack. As an output device, we use an i-glasses see-through stereoscopic color HMD. The display is fixed to a helmet worn by the user. Moreover, an InterSense InterTrax2 orientation sensor and a web camera for tracking interaction props are mounted on the helmet.

The main user interface is a pen and pad setup using a Wacom graphics tablet and its pen. Both devices are optically tracked by the camera using fiducial markers. The 2D position of the pen (provided by the Wacom tablet) is incorporated into the processing to provide more accurate tracking on the pad itself. Figure 1 gives an overview of the setup.
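The fusion of the tablet's 2D pen reading with the optically tracked pad pose can be sketched as follows. This is a minimal illustration under assumed coordinate conventions, not the actual system code; the function name and the example pad pose are invented.

```python
# Illustrative sketch (not the actual system code): the tablet reports the
# pen tip in 2D pad coordinates (u, v); combining this with the optically
# tracked 3D pose of the pad yields a pen position on the pad that is more
# accurate than optical pen tracking alone.

def pen_position_3d(pad_origin, pad_u_axis, pad_v_axis, pen_uv):
    """Map a 2D tablet reading onto the pad's tracked plane in 3D.

    pad_origin  -- 3D position of the pad's corner (from the markers)
    pad_u_axis  -- 3D unit vector along the pad's horizontal edge
    pad_v_axis  -- 3D unit vector along the pad's vertical edge
    pen_uv      -- (u, v) pen coordinates in metres, from the tablet
    """
    u, v = pen_uv
    return tuple(o + u * a + v * b
                 for o, a, b in zip(pad_origin, pad_u_axis, pad_v_axis))

# Pad lying flat at height 1.0 m, axes aligned with world X and Y:
tip = pen_position_3d((0.2, 0.1, 1.0), (1, 0, 0), (0, 1, 0), (0.05, 0.02))
# tip is approximately (0.25, 0.12, 1.0)
```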
User interface management software:
As our software platform we use Studierstube 2.1 [5], a user interface management system for AR based on, but not limited to, stereoscopic 3D graphics. It provides a multi-user, multi-application environment and supports a variety of display devices including stereoscopic HMDs. It also provides the means of 6DOF interaction, either with virtual user interface elements registered with the pad, or directly with 3D graphical objects.

Applications are implemented as runtime loadable objects executing in designated volumetric containers, a kind of 3D window equivalent. While the original stationary Studierstube environment allowed a user to arrange multiple applications in a stationary workspace, our mobile setup with a body-stabilized display allows the user to arrange 3D information in a wearable workspace that travels along with her. Applications stay where they are put relative to the user, and can be easily accessed anytime, aided by proprioception and spatial memory. Figure 2 shows a simple painting application.
Figure 2. A user interacting with the paint application.

Our interface management system is capable of managing multiple locales, which can each contain any number of applications. Locales are important for multi-user or multi-display operation. For example, each mobile user will require a separate wearable workspace that defines a distinct locale (coordinate system). As one user moves about, a second user's locale will be unaffected, but the second user will be able to see the movement of the graphical objects contained in the first user's locale. For effective collaboration, it will in most cases be necessary to add a third, stationary locale that contains the graphical applications both users should work with.
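The idea of a body-stabilized workspace, where an application keeps its pose relative to the user rather than the world, can be sketched as below. This is a toy 2D illustration with invented names, not the Studierstube API.

```python
# Toy sketch of a body-stabilized locale (invented names, not the
# Studierstube API): each application pose is stored relative to the user,
# so its world position follows the user as she moves about.
import math

class WearableLocale:
    def __init__(self):
        self.apps = {}          # name -> (dx, dy) offset relative to user

    def place(self, name, offset):
        self.apps[name] = offset

    def world_position(self, name, user_pos, user_heading):
        """World position of an app for a user at user_pos, facing
        user_heading (radians)."""
        dx, dy = self.apps[name]
        c, s = math.cos(user_heading), math.sin(user_heading)
        # rotate the body-relative offset into world coordinates
        return (user_pos[0] + c * dx - s * dy,
                user_pos[1] + s * dx + c * dy)

locale = WearableLocale()
locale.place("paint", (0.0, 0.5))   # half a metre in front of the user
# The app stays half a metre in front, wherever the user stands:
p1 = locale.world_position("paint", (0, 0), 0.0)
p2 = locale.world_position("paint", (3, 4), 0.0)
```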
Tracking:
Mobile AR requires significantly more complex tracking than a traditional VR application. In a typical VR or AR application, tracking data passes through a series of steps: it is generated by tracking hardware, read by device drivers, transformed to fit the requirements of the application, and sent over network connections to other hosts. These tasks are handled by OpenTracker [6], an open software architecture for the different tasks involved in tracking input devices and processing multimodal tracking data. The main concept behind OpenTracker is to break up the whole data manipulation into individual steps and build a data flow network of these transformations. The framework's design is based on XML, taking full advantage of this new technology by allowing the use of standard XML tools for development, configuration and documentation.

OpenTracker uses a computer vision library called ARToolkit [7] to implement the tracking of the fiducial markers on the interaction props. It analyses the video images delivered by the web camera mounted on the helmet and establishes the position of the pen and the pad relative to the user's head.
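The data flow idea behind OpenTracker, raw tracker events passing through a chain of transformation nodes, can be sketched generically as follows. This is a hedged illustration in Python with invented node names and event format, not OpenTracker's actual C++/XML interface.

```python
# Generic data-flow sketch of the OpenTracker concept (invented node names,
# not the real C++/XML interface): each node transforms an event and passes
# it on, so drivers, filters and network sinks compose freely.

def source(position):
    """Stands in for a device driver reading raw tracker data."""
    return {"position": position}

def offset(event, dx, dy, dz):
    """A transformation node, e.g. a sensor-to-helmet calibration offset."""
    x, y, z = event["position"]
    return {"position": (x + dx, y + dy, z + dz)}

def scale(event, factor):
    """Another node, e.g. a unit conversion from centimetres to metres."""
    return {"position": tuple(factor * c for c in event["position"])}

def pipeline(raw):
    # The same chain that an XML configuration file would describe:
    # source -> offset -> scale -> application
    return scale(offset(source(raw), 0, 0, 10), 0.01)

event = pipeline((120, 50, 30))   # raw reading in cm, sensor 10 cm above origin
```

Describing such a chain declaratively (in OpenTracker's case, as an XML document) means the same transformation nodes can be rearranged with standard tools, without recompiling the application.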
Location tracking:
A similar technique is used to track the user's position within the environment. Our laboratory and the neighboring rooms are rigged with larger markers along the walls. The locations of these markers are measured and incorporated in a model of the building. Together with the tracking information delivered by the fiducial tracking library, the system computes the user's position within these rooms from the detected markers.

Location based AR applications:
Building on the mobile platform described above, we are currently developing a number of prototype location based Augmented Reality applications. These applications are based on the location tracking described in the last section.

A simple location based application is the AR library. It performs two basic tasks: firstly, it shows a user the location of a requested book in the vast bookshelves of a library; and secondly, it recognizes books when the user looks at them again and displays the correct location in the shelf. In the prototype application, a bookshelf was fitted out with fiducial markers used for tracking, so that the bookshelf's position can be computed by the tracking library. Dedicated books were rigged with these markers as well, so that the system recognizes such a book when the user is looking at it. The location of the selected book is then shown in the shelf; alternatively, the markers are attached to the wall instead of a real shelf. Figure 3 shows both modes.
Figure 3. The correct location of a detected book is displayed.
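Recovering the user's position from a detected wall marker, by composing the marker's surveyed world pose with the inverse of its camera-relative pose, can be sketched in 2D as follows. This is a simplified illustration under assumed coordinate conventions, not the ARToolkit code itself.

```python
# 2D sketch of marker-based self-localisation (simplified, not the actual
# ARToolkit code): the marker's surveyed world pose, composed with the
# inverse of its pose in camera coordinates, yields the camera (user) pose.
import math

def mat(theta, tx, ty):
    """3x3 homogeneous 2D transform: rotation theta plus translation."""
    c, s = math.cos(theta), math.sin(theta)
    return [[c, -s, tx], [s, c, ty], [0, 0, 1]]

def mul(a, b):
    return [[sum(a[i][k] * b[k][j] for k in range(3)) for j in range(3)]
            for i in range(3)]

def inv(m):
    """Inverse of a rigid 2D transform [R|t] is [R^T | -R^T t]."""
    c, s, tx, ty = m[0][0], m[1][0], m[0][2], m[1][2]
    return [[c, s, -(c * tx + s * ty)],
            [-s, c, -(-s * tx + c * ty)],
            [0, 0, 1]]

# Marker surveyed at world position (5, 2), aligned with the world frame:
world_T_marker = mat(0.0, 5.0, 2.0)
# The vision library reports the marker 2 m in front of the camera:
camera_T_marker = mat(0.0, 0.0, 2.0)
# Camera (user) pose in the world:
world_T_camera = mul(world_T_marker, inv(camera_T_marker))
x, y = world_T_camera[0][2], world_T_camera[1][2]
# The user stands at roughly (5, 0), two metres back from the marker.
```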
Another typical scenario for mobile location based AR systems is a way finding application. The aim is to guide a user along a path to a selected destination. This is accomplished using two means: a world-in-miniature model of the environment with the user's location and path highlighted, and augmenting the user's view with navigation guides such as arrows, highlighted doors and lines along the desired path. Such an application requires a model of the environment as well as a means to track the user's location within this real environment. As described above, we prepared the environment to allow the system to compute this: for each room, a set of markers was set up and their locations measured. The location tracking can now establish the user's position and the direction she is looking in. Thus the system can continuously display navigation information registered to the real world.

In the application itself the user is presented with a miniature model of the environment on the tablet. Her location and the current room are highlighted. She can select a destination by clicking into the room she wants to go to. The system then computes the shortest path to this room and highlights the rooms she needs to cross. Additionally, the doors she needs to take are augmented in the user's view to guide her along the path to the destination.
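The shortest-path step above can be sketched as a breadth-first search over a room adjacency graph, where rooms are nodes and doors are edges. This is an illustrative sketch with an invented floor plan, not the application's actual code.

```python
# BFS sketch of the way finding step (invented floor plan, not the actual
# application code): rooms are graph nodes, doors are edges, and the
# shortest door-to-door route is read back from the BFS predecessors.
from collections import deque

def shortest_path(doors, start, goal):
    """doors maps each room to the rooms reachable through one door."""
    prev = {start: None}
    queue = deque([start])
    while queue:
        room = queue.popleft()
        if room == goal:
            path = []
            while room is not None:
                path.append(room)
                room = prev[room]
            return path[::-1]
        for nxt in doors[room]:
            if nxt not in prev:
                prev[nxt] = room
                queue.append(nxt)
    return None

# A small invented floor plan:
doors = {
    "lab":      ["corridor"],
    "corridor": ["lab", "office", "library"],
    "office":   ["corridor"],
    "library":  ["corridor"],
}
route = shortest_path(doors, "lab", "library")
# route == ["lab", "corridor", "library"]
```

With unweighted rooms, BFS already yields the route crossing the fewest rooms; door-to-door distances would call for Dijkstra's algorithm instead.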
Future work:
The prototype applications are not finished yet. We plan to augment a real library and test the application there. The way finding application will be extended to encompass a part of our building, allowing the user to roam in a larger environment. The integration of both applications is straightforward because of the multi-application features of the Studierstube system: it will allow the user to find her way to the library and then use the library application in place.

Conclusion:
This paper describes our work to develop a mobile AR platform that allows location based computing. While most related work focuses on providing information as 2D text or overlays, we concentrate on 3D information that the user can interact with. First we described the mobile setup itself, consisting of the hardware used and the software system developed. Then we described two prototype applications we are currently developing to demonstrate the abilities of the platform.
References:
[1] Azuma R.: A Survey of Augmented Reality. Presence, Vol. 6, No. 4, pp. 355-385, August 1997.
[2] Starner T., Mann S., Rhodes B., Levine J., Healey J., Kirsch D., Picard R.: Augmented Reality Through Wearable Computing. Presence, Vol. 6, No. 4, pp. 386-398, August 1997.