WO2002033688A2 - Dynamic integration of computer generated and real world images - Google Patents

Dynamic integration of computer generated and real world images Download PDF

Info

Publication number
WO2002033688A2
Authority
WO
WIPO (PCT)
Prior art keywords
user
information
recited
computer
real world
Prior art date
Application number
PCT/US2001/031986
Other languages
French (fr)
Other versions
WO2002033688B1 (en)
WO2002033688A3 (en)
Inventor
Kenneth H. Abbott, III
Dan Newell
James O. Robarts
Original Assignee
Tangis Corporation
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Tangis Corporation filed Critical Tangis Corporation
Priority to AU2002211698A (published as AU2002211698A1)
Publication of WO2002033688A2
Publication of WO2002033688A3
Publication of WO2002033688B1

Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T19/00 Manipulating 3D models or images for computer graphics
    • G06T19/006 Mixed reality
    • G PHYSICS
    • G02 OPTICS
    • G02B OPTICAL ELEMENTS, SYSTEMS OR APPARATUS
    • G02B27/00 Optical systems or apparatus not provided for by any of the groups G02B1/00 - G02B26/00, G02B30/00
    • G02B27/01 Head-up displays
    • G02B27/017 Head mounted
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00 Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01 Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/011 Arrangements for interaction with the human body, e.g. for user immersion in virtual reality
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00 Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01 Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/011 Arrangements for interaction with the human body, e.g. for user immersion in virtual reality
    • G06F3/012 Head tracking input arrangements
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00 Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01 Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/011 Arrangements for interaction with the human body, e.g. for user immersion in virtual reality
    • G06F3/013 Eye tracking input arrangements
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T11/00 2D [Two Dimensional] image generation
    • G PHYSICS
    • G02 OPTICS
    • G02B OPTICAL ELEMENTS, SYSTEMS OR APPARATUS
    • G02B27/00 Optical systems or apparatus not provided for by any of the groups G02B1/00 - G02B26/00, G02B30/00
    • G02B27/01 Head-up displays
    • G02B27/0101 Head-up displays characterised by optical features
    • G02B2027/0112 Head-up displays characterised by optical features comprising device for generating colour display
    • G PHYSICS
    • G02 OPTICS
    • G02B OPTICAL ELEMENTS, SYSTEMS OR APPARATUS
    • G02B27/00 Optical systems or apparatus not provided for by any of the groups G02B1/00 - G02B26/00, G02B30/00
    • G02B27/01 Head-up displays
    • G02B27/0101 Head-up displays characterised by optical features
    • G02B2027/0118 Head-up displays characterised by optical features comprising devices for improving the contrast of the display / brilliance control visibility
    • G PHYSICS
    • G02 OPTICS
    • G02B OPTICAL ELEMENTS, SYSTEMS OR APPARATUS
    • G02B27/00 Optical systems or apparatus not provided for by any of the groups G02B1/00 - G02B26/00, G02B30/00
    • G02B27/01 Head-up displays
    • G02B27/0101 Head-up displays characterised by optical features
    • G02B2027/014 Head-up displays characterised by optical features comprising information/image processing systems
    • G PHYSICS
    • G02 OPTICS
    • G02B OPTICAL ELEMENTS, SYSTEMS OR APPARATUS
    • G02B27/00 Optical systems or apparatus not provided for by any of the groups G02B1/00 - G02B26/00, G02B30/00
    • G02B27/01 Head-up displays
    • G02B27/0179 Display position adjusting means not related to the information to be displayed
    • G02B2027/0187 Display position adjusting means not related to the information to be displayed slaved to motion of at least a part of the body of the user, e.g. head, eye

Definitions

  • the present invention is directed to controlling the appearance of information presented on displays, such as those used in conjunction with wearable personal computers. More particularly, the invention relates to transparent graphical user interfaces that present information transparently on real world images to minimize obstructing the user's view of the real world images.
  • computers become increasingly powerful and ubiquitous, users increasingly employ their computers for a broad variety of tasks. For example, in addition to traditional activities such as running word processing and database applications, users increasingly rely on their computers as an integral part of their daily lives. Programs to schedule activities, generate reminders, and provide rapid communication capabilities are becoming increasingly popular. Moreover, computers are increasingly present during virtually all of a person's daily activities. For example, hand-held computer organizers (e.g., PDAs) are more common, and communication devices such as portable phones are increasingly incorporating computer capabilities. Thus, users may be presented with output information from one or more computers at any time.
  • the user cannot view the computer-generated information at the same time as the real-world information. Rather, the user is typically forced to switch between the real world and the virtual world by either mentally changing focus or by physically actuating some switching mechanism that alternates between displaying the real world and displaying the virtual world. To view the real world, the user must stop looking at the display of virtual information and concentrate on the real world. Conversely, to view the virtual information, the user must stop looking at the real world.
  • Switching display modes in this way can lead to awkward, or even dangerous, situations that leave the user in transition, and sometimes in the wrong mode, when they need to deal with an important event.
  • An example of this awkward behavior is found in the inadequate current technology of head-worn computer displays.
  • Some such hardware is equipped with an extra piece that flips down behind the visor display, making the background completely opaque when the user needs to view more information, or needs to view it without the distraction of the real-world image.
  • a system is provided to integrate computer-generated virtual information with real world images on a display, such as a head-mounted display of a wearable computer.
  • the system presents the virtual information in a way that creates little interference with the user's view of the real world images.
  • the system further modifies how the virtual information is presented to alter whether the virtual information is more or less visible relative to the real world images. The modification may be made dynamically, such as in response to a change in the user's context, or user's eye focus on the display, or a user command.
  • the virtual information may be modified in a number of ways.
  • the virtual information is presented transparently on the display and overlays the real world images.
  • the user can easily view the real world images through the transparent information.
  • the system can then dynamically adjust the degree of transparency across a range from fully transparent to fully opaque, depending upon how noticeable the information should be.
  • the system modifies the color of the virtual information to selectively blend or contrast the virtual information with the real world images. Borders may also be drawn around the virtual information to set it apart. Another way to modify presentation is to dynamically move the virtual information on the display to make it more or less prominent for viewing by the user.
  • Fig. 1 illustrates a wearable computer having a head mounted display and mechanisms for displaying virtual information on the display together with real world images.
  • Fig. 2 is a diagrammatic illustration of a view of real world images through the head mounted display.
  • the illustration shows a transparent user interface (UI) that presents computer-generated information on the display over the real world images in a manner that minimally distracts from the user's view of the real world images.
  • Fig. 3 is similar to Fig. 2, but further illustrates a transparent watermark overlaid on the real world images.
  • Fig. 4 is similar to Fig. 2, but further illustrates context specific information depicted relative to the real world images.
  • Fig. 5 is similar to Fig. 2, but further illustrates a border about the information.
  • Fig. 6 is similar to Fig. 2, but further illustrates a way to modify prominence of the virtual information by changing its location on the display.
  • Fig. 7 is similar to Fig. 2, but further illustrates enclosing the information within a marquee.
  • Fig. 8 shows a process for integrating computer-generated information with real world images on a display.
  • Described below is a system and user interface that enables simultaneous display of virtual information and real world information with minimal distraction to the user.
  • the user interface is described in the context of a head mounted visual display (e.g., eye glasses display) of a wearable computing system that allows a user to view the real world while overlaying additional virtual information.
  • the user interface may be used for other displays and in contexts other than the wearable computing environment.
  • Fig. 1 illustrates a body-mounted wearable computer 100 worn by a user 102.
  • the computer 100 includes a variety of body-worn input devices, such as a microphone 110, a hand-held flat panel display 112 with character recognition capabilities, and various other user input devices 114.
  • Examples of other types of input devices with which a user can supply information to the computer 100 include voice recognition devices, traditional qwerty keyboards, chording keyboards, half qwerty keyboards, dual forearm keyboards, chest mounted keyboards, handwriting recognition and digital ink devices, a mouse, a track pad, a digital stylus, a finger or glove device to capture user movement, pupil tracking devices, a gyropoint, a trackball, a voice grid device, digital cameras (still and motion), and so forth.
  • the computer 100 also has a variety of body-worn output devices, including the hand-held flat panel display 112, an earpiece speaker 116, and a head-mounted display in the form of an eyeglass-mounted display 118.
  • the eyeglass-mounted display 118 is implemented as a display type that allows the user to view real world images from their surroundings while simultaneously overlaying or otherwise presenting computer-generated information to the user in an unobtrusive manner.
  • the display may be constructed to permit direct viewing of real images (i.e., permitting the user to gaze directly through the display at the real world objects) or to show real world images captured from the surroundings by video devices, such as digital cameras.
  • Other output devices 120 may also be incorporated into the computer 100, such as a tactile display, an olfactory output device, and the like.
  • the computer 100 may also be equipped with one or more various body-worn user sensor devices 122.
  • sensors can provide information about the current physiological state of the user and current user activities. Examples of such sensors include thermometers, sphygmometers, heart rate sensors, shiver response sensors, skin galvanometry sensors, eyelid blink sensors, pupil dilation detection sensors, EEG and EKG sensors, sensors to detect brow furrowing, blood sugar monitors, etc.
  • sensors elsewhere in the near environment can provide information about the user, such as motion detector sensors (e.g., whether the user is present and is moving), badge readers, still and video cameras (including low light, infra-red, and x-ray), remote microphones, etc.
  • sensors can be either passive (i.e., detecting information generated external to the sensor, such as a heartbeat) or active (i.e., generating a signal to obtain information, such as sonar or x-rays).
  • the computer 100 may also be equipped with various environment sensor devices 124 that sense conditions of the environment surrounding the user.
  • devices such as microphones or motion sensors may be able to detect whether there are other people near the user and whether the user is interacting with those people.
  • Sensors can also detect environmental conditions that may affect the user, such as air thermometers or Geiger counters.
  • Sensors, either body-mounted or remote, can also provide information related to a wide variety of user and environment factors including location, orientation, speed, direction, distance, and proximity to other locations (e.g., GPS and differential GPS devices, orientation tracking devices, gyroscopes, altimeters, accelerometers, anemometers, pedometers, compasses, laser or optical range finders, depth gauges, sonar, etc.).
  • Identity and informational sensors (e.g., bar code readers, biometric scanners, laser scanners, OCR, badge readers, etc.) and remote sensors (e.g., home or car alarm systems, remote cameras, national weather service web pages, baby monitors, traffic sensors, etc.) can also provide relevant environment information.
  • the computer 100 further includes a central computing unit 130 that may or may not be worn on the user.
  • the various inputs, outputs, and sensors are connected to the central computing unit 130 via one or more data communications interfaces 132 that may be implemented using wire-based technologies (e.g., wires, coax, fiber optic, etc.) or wireless technologies (e.g., RF, etc.).
  • the central computing unit 130 includes a central processing unit (CPU) 140, a memory 142, and a storage device 144.
  • the memory 142 may be implemented using both volatile and non-volatile memory, such as RAM, ROM, Flash, EEPROM, disk, and so forth.
  • the storage device 144 is typically implemented using non-volatile permanent memory, such as ROM, EEPROM, diskette, memory cards, and the like.
  • One or more application programs 146 are stored in memory 142 and executed by the CPU 140.
  • the application programs 146 generate data that may be output to the user via one or more of the output devices 112, 116, 118, and 120.
  • a transparent user interface (UI) component 148 that is designed to present computer-generated information to the user via the eyeglass-mounted display 118 in a manner that does not distract the user from viewing the real world.
  • the transparent UI 148 organizes orientation and presentation of the data and provides the control parameters that direct the display 118 to place the data before the user in many different ways that account for such factors as the importance of the information, relevancy to what is being viewed in the real world, and so on.
  • In the illustrated implementation, a Condition-Dependent Output Supplier (CDOS) system 150 is also stored in memory 142. The CDOS system 150 monitors the user and the user's environment, and creates and maintains an updated model of the current condition of the user. As the user moves about in various environments, the CDOS system receives various input information, including explicit user input, sensed user information, and sensed environment information. The CDOS system updates the current model of the user condition, and presents output information to the user via appropriate output devices.
  • the CDOS system 150 provides information that might affect how the transparent UI 148 presents the information to the user. For instance, suppose the application program 146 is generating geographically or spatially relevant information that should only be displayed when the user is looking in a specific direction. The CDOS system 150 may be used to generate data indicating where the user is looking. If the user is looking in the correct direction, the transparent UI 148 presents the data in conjunction with the real world view of that direction. If the user turns his/her head, the CDOS system 150 detects the movement and informs the application program 146, enabling the transparent UI 148 to remove the information from the display.
  • the body-mounted computer 100 may be connected to one or more networks of other devices through wired or wireless communication means (e.g., wireless RF, a cellular phone or modem, infrared, physical cable, a docking station, etc.).
  • the body-mounted computer of a user could make use of output devices in a smart room, such as a television and stereo when the user is at home, if the body-mounted computer can transmit information to those devices via a wireless medium or if a cabled or docking mechanism is available to transmit the information.
  • kiosks or other information devices can be installed at various locations (e.g., in airports or at tourist spots) to transmit relevant information to body-mounted computers within the range of the information device.
  • Fig. 2 shows an exemplary view that the user of the wearable computer 100 might see when looking at the eyeglass-mounted display 118.
  • the display 118 depicts a graphical screen presentation 200 generated by the transparent UI 148 of the application program 146 executing on the wearable computer 100.
  • the screen presentation 200 permits viewing of the real world surroundings 202, illustrated here as a mountain range.
  • the transparent screen presentation 200 presents information to the user in a manner that does not significantly impede the user's view of the real world 202.
  • the virtual information consists of a menu 204 that lists various items of interest to the user.
  • the menu 204 includes context relevant information such as the present temperature, current elevation, and time.
  • the menu 204 may further include navigation items that allow the user to navigate to various levels of information being monitored or stored by the computer 100.
  • the menu items include mapping, email, communication, body parameters, and geographical location.
  • the menu 204 is placed along the side of the display to minimize any distraction from the user's vision of the real world.
  • the menu 204 is presented transparently, enabling the user to see the real world images 202 behind the menu.
  • the transparent UI possesses many features that are directed toward the goal of displaying virtual information to the user without impeding too much of the user's view of the real world. Some of these features are explored below to provide a better understanding of the transparent UI.
  • the transparent UI 148 is capable of dynamically changing the transparency of the virtual information.
  • the application program 146 can change the degree of transparency of the menu 204 (or other virtual objects) by implementing a display range from completely opaque to completely transparent. This display range allows the user to view both real world and virtual-world information at the same time, with dynamic changes being performed for a variety of reasons.
  • One reason to change the transparency might be the level of importance ascribed to the information. As the information is deemed more important by the application program 146 or user, the transparency is decreased to draw more attention to the information.
  • Another reason to vary transparency might be context specific. Integrating the transparent UI into a system that models the user's context allows the transparent UI to vary the degree of transparency in response to a rich set of states from the user, their environment, or the computer and its peripheral devices. Using this model, the system can automatically determine what parts of the virtual information to display as more or less transparent and vary their respective transparencies accordingly.
  • the application program may decrease the transparency toward the opaque end of the display range to increase the noticeability of the information for the user. Conversely, if the information is less relevant for a given context, the application program may increase the transparency toward the fully transparent end of the display range to diminish the noticeability of the virtual information.
  • mapping program may display directional graphics when the user is looking in one direction and fade those graphics out (i.e., make them more transparent) when the user moves his/her head to look in another direction.
  • the virtual object's transparency increases as the user no longer focuses on the object.
  • the transparency may further be configured to change over time, allowing the virtual image to fade in and out depending on the circumstances. For example, an unused window can fade from view, becoming very transparent or perhaps eventually fully transparent, when the user maintains their focus elsewhere. The window may then fade back into view when the user attention is returned to it.
  • For comparatively important virtual objects, like those used for control, status, power, safety, etc., the user may configure the system to never fade the specified virtual objects.
  • This type of configuration can be performed dynamically on specific objects or by making changes to a general system configuration.
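To make the fading behavior described above concrete, here is a minimal Python sketch. The class layout, the fade delay and rate, and the 0.0 to 1.0 alpha scale are illustrative assumptions, not details taken from the disclosure.

```python
import time

class VirtualObject:
    """A displayed virtual object with an alpha value in [0.0, 1.0].

    alpha = 0.0 is fully transparent, alpha = 1.0 is fully opaque.
    Objects flagged never_fade (e.g., control, status, or safety
    indicators) keep their visibility regardless of user focus.
    """
    def __init__(self, name, alpha=0.8, never_fade=False):
        self.name = name
        self.alpha = alpha
        self.never_fade = never_fade
        self.last_focused = time.monotonic()

    def mark_focused(self):
        """Call whenever eye tracking reports the user viewing this object."""
        self.last_focused = time.monotonic()

def update_transparency(obj, fade_delay=5.0, fade_rate=0.1, dt=0.1):
    """Fade an unused object toward full transparency; fade it back
    into view once the user's attention returns to it."""
    if obj.never_fade:
        return
    idle = time.monotonic() - obj.last_focused
    if idle > fade_delay:
        obj.alpha = max(0.0, obj.alpha - fade_rate * dt)
    else:
        obj.alpha = min(1.0, obj.alpha + fade_rate * dt)
```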
  • the transparent UI can also be controlled by the user instead of the application program. Examples of this involve a visual target in the user interface that is used to adjust transparency of the virtual objects being presented to the user.
  • this target can be a control button or slider that is controlled by any variety of input methods available to the user (e.g., voice, eye-tracking controls to control the target/control object, keyboard, etc.).
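A sketch of such a user-controlled transparency target follows. The 0 to 100 slider scale and the attribute names are assumptions; any input method (voice, eye tracking, keyboard) would simply call adjust().

```python
from types import SimpleNamespace

class TransparencySlider:
    """Visual control target that maps a 0-100 slider value onto the
    alpha of the virtual object it is bound to."""
    def __init__(self, target, value=80):
        self.target = target      # object exposing an .alpha field
        self.value = value        # 0 = fully transparent, 100 = fully opaque
        self._apply()

    def adjust(self, delta):
        self.value = max(0, min(100, self.value + delta))
        self._apply()

    def _apply(self):
        self.target.alpha = self.value / 100.0

# Hypothetical usage: a "more transparent" voice command lowers the slider.
menu = SimpleNamespace(alpha=0.8)
slider = TransparencySlider(menu)
slider.adjust(-30)               # menu.alpha is now 0.5
```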
  • the transparent UI 148 may also be configured to present faintly visible notifications with high transparency to hint to the user that additional information is available for presentation.
  • the notification is usually depicted in response to some event about which an application desires to notify the user.
  • the faintly visible notification notifies the user without disrupting the user's concentration on the real world surroundings.
  • the virtual image can be formed by manipulating the real world image, akin to watermarking the digital image in some manner.
  • Fig. 3 shows an example of a watermark notification 300 overlaid on the real world image 202.
  • the watermark notification 300 is a graphical envelope icon that suggests to the user that new, unread electronic mail has been received.
  • the envelope icon is illustrated in dashed lines around the edge of the full display to demonstrate that the icon is faintly visible (or highly transparent) to avoid obscuring the view of the mountain range.
  • the user is able to see through the watermark due to its partial transparency, thus helping the user to easily focus on the current task.
  • the notification may come in many different shapes, positions, and sizes, including a new window, other icon shapes, or some other graphical presentation of information to the user.
  • the watermark notification can be suggestive of a particular task to orient the user to the task at hand (i.e., read mail).
  • the application program 146 can decrease the transparency of the information and make it more or less visible.
  • Such information can be used in a variety of situations, such as incoming information, or when more information related to the user's context or user's view (both virtual and real world) is available, or when a reminder is triggered, or anytime more information is available than can be viewed at one time, or for providing "help".
  • Such watermarks can also be used for hinting to the user about advertisements that could be presented to the user.
  • the watermark notification also functions as an active control that may be selected by the user to control an underlying application. When the user looks at the watermark image, or in some other way selects the image, it becomes visibly opaque.
  • the user's method for selecting the image includes any of the various ways a user of a wearable personal computer can perform selections of graphical objects (e.g., blinking, voice selection, etc.).
  • the user can configure this behavior in the system before the commands are given to the system, or generate the system behaviors by commands, controls, or corrections to the system.
  • the application program provides a suitable response.
  • user selection of the envelope icon 300 might cause the email program to display the newly received email message.
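The watermark life cycle described above might be sketched as follows; the alpha constants and the callback wiring are assumptions made for illustration.

```python
class WatermarkNotification:
    """A faint, highly transparent hint that doubles as an active control:
    selecting it (by gaze, blink, or voice) makes it opaque and invokes
    the underlying application's action."""
    HINT_ALPHA = 0.15       # faintly visible, barely obscures the view
    SELECTED_ALPHA = 1.0    # fully opaque once the user attends to it

    def __init__(self, icon, on_select):
        self.icon = icon
        self.alpha = self.HINT_ALPHA
        self.on_select = on_select

    def select(self):
        self.alpha = self.SELECTED_ALPHA
        self.on_select()

# Hypothetical usage: the envelope watermark opens the new email message.
notice = WatermarkNotification("envelope",
                               on_select=lambda: print("show new email"))
notice.select()
```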
  • the transparent UI may also be configured to present information in different degrees of transparency depending upon the user's context.
  • the application program 146 may be provided with context data that influences how the virtual information is presented to the user via the transparent UI.
  • Fig. 4 shows one example of presenting virtual information according to the user's context.
  • this example illustrates a situation where the virtual information is presented to the user only when the user is facing a particular direction.
  • the user is looking toward the mountain range.
  • Virtual information 400 in the form of a climbing aid is overlaid on the display.
  • the climbing aid 400 highlights a desired trail to be taken by the user when scaling the mountain.
  • the trail 400 is visible (i.e., a low degree of transparency) when the user faces in a direction such that the particular mountain is within the viewing area.
  • the trail remains indexed to the appropriate mountain, effectively moving across the screen at the rate of the head rotation.
  • when the user turns away, the computer 100 will sense that the user is looking in another direction. This data will be input to the application program controlling the trail display, and the trail 400 will be removed from the display (or made completely transparent). In this manner, the climbing aid is more intuitive to the user, appearing only when the user is facing the relevant task.
  • This is just one example of modifying the display of virtual information in conjunction with real world surroundings based on the user's context. There are many other situations that may dictate when virtual information is presented or withdrawn depending upon the user's context.
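One way to implement this direction-gated display is to compare the user's sensed compass heading against the bearing of the relevant landmark. The 40-degree field of view and the linear fade near its edge are assumptions for illustration.

```python
def heading_difference(a, b):
    """Smallest absolute difference, in degrees, between two compass headings."""
    d = abs(a - b) % 360.0
    return min(d, 360.0 - d)

def trail_alpha(user_heading, target_bearing, fov=40.0):
    """Return the trail overlay's alpha: invisible when the mountain is
    outside the viewing area, fading in as it approaches dead center."""
    off = heading_difference(user_heading, target_bearing)
    if off > fov / 2:
        return 0.0                  # user faces elsewhere: remove the trail
    return 1.0 - off / (fov / 2)    # fully visible when looking straight at it
```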
  • Borders are drawn around objects to provide greater control of transparency and opaqueness.
  • Fig. 5 illustrates the transparent UI 200 where a border 500 is drawn around the menu 204.
  • the border 500 draws a bit more attention to the menu 204 without noticeably distracting from the user's view of the real world 202.
  • Graphical images can be created with special borders embedded in the artwork, such that the borders can be used to highlight the virtual object.
  • Certain elements of the graphical information can also be given different opacity curves relating to visibility.
  • the border 500 might be assigned a different degree of transparency compared to the menu items 204 so that the border 500 would be the last to become fully transparent as the menu's transparency is increased. This behavior leaves the more distinct border 500 visible for the user to identify even after the menu items have been faded to nearly full transparency, thus leaving the impression that the virtual object still exists.
  • This feature also provides a distinct border, which, as long as it is visible, helps the user locate a virtual image, regardless of the transparency of the rest of the image.
  • another feature is to group more than one related object (e.g., by drawing boxes about them) to give similar degrees of transparency to a set of objects simultaneously.
  • Marquees are one embodiment of object borders. Marquees are dynamic objects that add prominence beyond static or highlighted borders by flashing, moving (e.g., cycling), or blinking the border around an object. These are only examples of the variety of ways a system can highlight virtual information so the user can more easily notice when the information is overlaid on top of the real-world view.
  • the application program may be configured to automatically detect edges of the display object.
  • the edge information may then be used by the application program to generate object borders dynamically.
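A minimal sketch of the border-lags-behind behavior: the border's alpha is held above the content's alpha by a fixed offset, so the outline is the last element to vanish. The 0.25 offset is an assumption.

```python
def border_and_content_alphas(content_alpha, border_offset=0.25):
    """As the menu fades toward full transparency, the border trails
    behind it, leaving a faint outline that marks where the virtual
    object still exists."""
    border_alpha = min(1.0, content_alpha + border_offset)
    return content_alpha, border_alpha
```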
  • Another technique for displaying virtual information in a manner that reduces the user's distraction from viewing of the real world is to change colors of the virtual objects to control their transparency, and hence visibility, against a changing real world view.
  • If a user interface containing virtually displayed information, such as program windows, icons, etc., is drawn with colors that clash with, or blend into, the background of real-world colors, the user is unable to properly view the information.
  • the application program 146 can be configured to detect conflict of colors and re-map the virtual-world colors so the virtual objects can be easily seen by the user, and so that the virtual colors do not clash with the real-world colors. This color detection and re-mapping makes the virtual objects easier to see and promotes greater control over the transparency of the objects.
  • color re-mapping might further involve mapping a current virtual-world color set to a smaller set of colors.
  • the need for such reduction can be detected automatically by the computer or the user can control all configuration adjustments by directing the computer to perform this action.
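The disclosure does not specify how color clashes are detected. One plausible sketch uses a relative-luminance contrast test and remaps a clashing color to whichever extreme stands out more against the background; the threshold and the black-or-white remap are assumptions.

```python
def relative_luminance(rgb):
    """Approximate perceived brightness of an (R, G, B) color in 0-255."""
    r, g, b = (c / 255.0 for c in rgb)
    return 0.2126 * r + 0.7152 * g + 0.0722 * b

def remap_color(virtual_rgb, background_rgb, min_contrast=0.4):
    """Keep a virtual color that already stands out; otherwise nudge it
    to black or white, whichever is farther from the real-world background."""
    fg = relative_luminance(virtual_rgb)
    bg = relative_luminance(background_rgb)
    if abs(fg - bg) >= min_contrast:
        return virtual_rgb
    return (255, 255, 255) if bg < 0.5 else (0, 0, 0)
```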
  • Another technique for presenting virtual information concurrently with the real world images is to manipulate the transparency of the background of the virtual information.
  • the visual backgrounds of virtual information can be dynamically displayed, such that the application program 146 causes the background to become transparent. This allows the user of the system to view more of the real world.
  • the application affords greater flexibility to the user for controlling the presentation of transparent information and further aids application developers in providing flexible transparent user interfaces.
  • Prominence is a factor pertaining to what part of the display should be given more emphasis, such as whether the real world view or the virtual information should be highlighted to capture more of the user's attention. Prominence can be considered when determining many of the features discussed above, such as the degree of transparency, the position of the virtual information, whether to post a watermark notification, and the like.
  • the user dictates prominence. For example, the computer system uses data from tracking the user's eye movement or head movement to determine whether the user wants to concentrate on the real-world view or the virtual information. Depending on the user's focus, the application program will grant more or less prominence to the real world (or virtual information).
  • This analysis allows the system to adjust transparency dynamically. If the user's eye is focusing on virtual objects, then those objects can be given more prominence, or maintain their current prominence without fading due to lack of use. If the user's eye is focusing on the real-world view, the system can cause the virtual world to become more opaque, and occlude less of the real world.
  • the variance of prominence can also be aided by understanding the user's context. By knowing the user's ability and safety, for example, the system can decide whether to permit greater prominence on the virtual world over the real world.
  • Suppose the user is riding a bus. The user desires the prominence to remain on the virtual world, but would still like the ability to focus temporarily on the real-world view. Brief flicks at the real-world view might be appropriate in this situation.
  • the prominence of the virtual world is diminished in favor of the real world view. This behavior can be configured by the user, or alternatively, the system can track eye focus to dynamically and automatically adjust the visibility of virtual information without occluding too much of the real world.
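The eye-driven prominence adjustment might reduce to a small update rule like the following; the step size and the 0.1 floor that keeps a faint trace visible are assumptions.

```python
def adjust_prominence(virtual_alpha, gaze_on_virtual, step=0.05):
    """Shift prominence toward whichever layer the user is watching:
    focusing on virtual objects raises their opacity, while focusing
    on the real-world view lowers it so less of the world is occluded."""
    if gaze_on_virtual:
        return min(1.0, virtual_alpha + step)
    return max(0.1, virtual_alpha - step)
```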
  • the system may also be configured to respond to eye commands entered via prescribed blinking sequences. For instance, the user's eyes can control prominence of virtual objects via a left-eye blink, or right-eye blink. Then, an opposite eye-blink would give prominence to the real-world view, instead of the virtual-world view. Alternatively, the user can direct the system to give prominence to a specific view by issuing a voice command. The user can tell the system to increase or decrease transparency of the virtual world or virtual objects.
  • the system may further be configured to alter prominence dynamically in response to changes in the user's focus.
  • the system can detect whether the user is looking at a specific virtual object. When the user has not viewed the object within a configurable length of time, the system slowly moves the object away from the center of the user's view, toward the user's peripheral vision.
  • Fig. 6 shows an example of a virtual object in the form of a compass 600 that is initially given prominence at a center position 602 of the display. Here, the user is focusing on the compass to get a bearing before scaling the mountain.
  • the eye tracking feedback is given to the application program, which slowly migrates the compass 600 from its center position to a peripheral location 604 as illustrated by the direction arrow 606. If the user does not stop the object from moving, it will reach the peripheral vision and thus be less of a distraction to the user.
  • the user can stipulate that the virtual object should return and/or remain in place by any one of a variety of methods.
  • Some examples of such stop-methods are: a vocal command, a single long blink of an eye, focusing the eye on a controlling aspect of the object (like a small icon, similar in look to a close-window box on a PC window).
  • Further configurable options from this stopped-state include the system's ability to eventually continue moving the object to the periphery, or instead, the user can lock the object in place (by another command similar to the one that stopped the original movement). At that point, the system no longer attempts to remove the object from the user's main focal area.
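The drift toward the periphery can be sketched as a constant-speed step along the line from the object's position to a peripheral anchor point. The pixel coordinates and speed are illustrative; a locked object (stopped by voice, a long blink, or gazing at its control icon) stays put.

```python
def migrate_toward_periphery(pos, periphery, locked, speed=2.0):
    """Move an unattended object one small step toward the screen edge.

    pos and periphery are (x, y) pixel coordinates; call once per frame
    while the object remains outside the user's focus.
    """
    if locked:
        return pos
    dx, dy = periphery[0] - pos[0], periphery[1] - pos[1]
    dist = (dx * dx + dy * dy) ** 0.5
    if dist <= speed:
        return periphery
    return (pos[0] + speed * dx / dist, pos[1] + speed * dy / dist)
```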
  • Marquees are dynamic objects that add prominence beyond static or highlighted borders by flashing, moving (e.g., cycling), or blinking the border around an object. These are only examples of the variety of ways a system can increase prominence of virtual-world information so the user can more easily notice when the information is overlaid on top of the real-world view.
  • Fig. 7 shows an example of a marquee 700 that scrolls across the display to provide information to the user.
  • the marquee 700 informs the user that their heart rate is reaching an 80% level.
  • Color mapping is another technique to adjust prominence, making virtual information stand out from or fade into the real-world view.
  • Fig. 8 shows processes 800 for operating a transparent UI that integrates virtual information within a real world view in a manner that minimizes distraction to the user.
  • the processes 800 may be implemented in software, or a combination of hardware and software.
  • the operations illustrated as blocks in Fig. 8 may represent computer-executable instructions that, when executed, direct the system to display virtual information and the real world in a certain manner.
  • the application program 146 generates virtual information intended to be displayed on the eyeglass-mounted display.
  • the application program 146 determines how to best present the virtual information (block 804). Factors for such a determination include the importance of the information, the user's context, immediacy of the information, relevancy of the information to the context, and so on.
  • the transparent UI 148 might initially assign a degree of transparency and a location on the display (block 806). In the case of a notification, the transparent UI 148 might present a faint watermark of a logo or other icon on the screen.
  • the transparent UI 148 might further consider adding a border, or modifying the color of the virtual information, or changing the transparency of the information's background.
  • the system then monitors the user behavior and conditions that gave rise to presentation of the virtual information (block 808). Based on this monitoring or in response to express user commands, the system determines whether a change in transparency or prominence is justified (block 810). If so, the transparent UI modifies the transparency of the virtual information and/or changes its prominence by fading the virtual image out or moving it to a less prominent place on the screen (block 812).
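The blocks of Fig. 8 can be read as a simple monitor-and-adjust loop. In this sketch the application program, display, and user monitor are reduced to plain callables, an assumption made to keep the example self-contained.

```python
def run_transparent_ui(generate, assess, render, sample, steps=100):
    """Loop mirroring Fig. 8: generate information (block 802), choose its
    presentation (blocks 804/806), then monitor and adjust (blocks 808-812)."""
    info = generate()                    # block 802: virtual information
    alpha, pos = assess(info)            # blocks 804/806: transparency, placement
    for _ in range(steps):
        render(info, alpha, pos)         # draw over the real-world view
        state = sample()                 # block 808: user behavior and conditions
        if state.get("change"):          # block 810: is an adjustment justified?
            alpha = state.get("alpha", alpha)   # block 812: fade in or out,
            pos = state.get("pos", pos)         # or move to a less prominent spot
```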

Abstract

A system integrates virtual information with real world images presented on a display, such as a head-mounted display of a wearable computer. The system modifies how the virtual information is presented to alter whether the virtual information is more or less visible relative to the real world images. The modification may be made dynamically, such as in response to a change in the user's context, or user's eye focus on the display, or a user command. The virtual information may be modified in a number of ways, such as adjusting the transparency of the information, modifying the color of the virtual information, enclosing the information in borders, and changing the location of the virtual information on the display. Through these techniques, the system provides the information to the user in a way that minimizes distraction of the user's view of the real world images.

Description

DYNAMIC INTEGRATION OF COMPUTER GENERATED AND REAL
WORLD IMAGES
RELATED APPLICATIONS
A claim of priority is made to U.S. Provisional Application No. 60/240,672, filed October 16, 2000, entitled "Method For Dynamic Integration Of Computer Generated And Real World Images", and to U.S. Provisional Application No. 60/240,684, filed October 16, 2000, entitled "Methods for Visually Revealing Computer Controls".
TECHNICAL FIELD
The present invention is directed to controlling the appearance of information presented on displays, such as those used in conjunction with wearable personal computers. More particularly, the invention relates to transparent graphical user interfaces that present information transparently on real world images to minimize obstructing the user's view of the real world images.
BACKGROUND OF THE INVENTION
As computers become increasingly powerful and ubiquitous, users increasingly employ their computers for a broad variety of tasks. For example, in addition to traditional activities such as running word processing and database applications, users increasingly rely on their computers as an integral part of their daily lives. Programs to schedule activities, generate reminders, and provide rapid communication capabilities are becoming increasingly popular. Moreover, computers are increasingly present during virtually all of a person's daily activities. For example, hand-held computer organizers (e.g., PDAs) are more common, and communication devices such as portable phones are increasingly incorporating computer capabilities. Thus, users may be presented with output information from one or more computers at any time.
While advances in hardware make computers increasingly ubiquitous, traditional computer programs are not typically designed to efficiently present information to users in a wide variety of environments. For example, most computer programs are designed with a prototypical user being seated at a stationary computer with a large display device, and with the user devoting full attention to the display. In that environment, the computer can safely present information to the user at any time, with minimal risk that the user will fail to perceive the information or that the information will disturb the user in a dangerous manner (e.g., by startling the user while they are using power machinery or by blocking their vision while they are moving with information sent to a head-mounted display). However, in many other environments these assumptions about the prototypical user are not true, and users thus may not perceive output information (e.g., failing to notice an icon or message on a hand-held display device when it is holstered, or failing to hear audio information when in a noisy environment or when intensely concentrating). Similarly, some user activities may have a low degree of interruptibility (i.e., ability to safely interrupt the user) such that the user would prefer that the presentation of low-importance or of all information be deferred, or that information be presented in a non-intrusive manner.
Consider an environment in which the user must be cognizant of the real world surroundings simultaneously with receiving information. Conventional computer systems have attempted to display information to users while also allowing the user to view the real world. However, such systems are unable to display this virtual information without obscuring the real-world view of the user. Virtual information can be displayed to the user, but doing so visually impedes much of the user's view of the real world.
Often the user cannot view the computer-generated information at the same time as the real-world information. Rather, the user is typically forced to switch between the real world and the virtual world by either mentally changing focus or by physically actuating some switching mechanism that alternates between displaying the real world and displaying the virtual world. To view the real world, the user must stop looking at the display of virtual information and concentrate on the real world. Conversely, to view the virtual information, the user must stop looking at the real world.
Switching display modes in this way can lead to awkward, or even dangerous, situations that leave the user in transition, and sometimes in the wrong mode, when they need to deal with an important event. An example of this awkward behavior is found in the inadequate current technology of head-worn computer displays. Some such hardware is equipped with an extra piece that flips down behind the visor display, making the background completely opaque when the user needs to view more information, or needs to view it without the distraction of the real-world image.
Accordingly, there is a need for new techniques to display virtual information to a user in a manner that does not disrupt, or disrupts very little, the user's view of the real world.
SUMMARY OF THE INVENTION
A system is provided to integrate computer-generated virtual information with real world images on a display, such as a head-mounted display of a wearable computer. The system presents the virtual information in a way that creates little interference with the user's view of the real world images. The system further modifies how the virtual information is presented to alter whether the virtual information is more or less visible relative to the real world images. The modification may be made dynamically, such as in response to a change in the user's context, or user's eye focus on the display, or a user command.
The virtual information may be modified in a number of ways. In one implementation, the virtual information is presented transparently on the display and overlays the real world images. The user can easily view the real world images through the transparent information. The system can then dynamically adjust the degree of transparency across a range from fully transparent to fully opaque, depending upon how noticeable the information should be.
In another implementation, the system modifies the color of the virtual information to selectively blend or contrast the virtual information with the real world images. Borders may also be drawn around the virtual information to set it apart. Another way to modify presentation is to dynamically move the virtual information on the display to make it more or less prominent for viewing by the user.
BRIEF DESCRIPTION OF THE DRAWINGS
Fig. 1 illustrates a wearable computer having a head mounted display and mechanisms for displaying virtual information on the display together with real world images.
Fig. 2 is a diagrammatic illustration of a view of real world images through the head mounted display. The illustration shows a transparent user interface (UI) that presents computer-generated information on the display over the real world images in a manner that minimally distracts from the user's view of the real world images.
Fig. 3 is similar to Fig. 2, but further illustrates a transparent watermark overlaid on the real world images.
Fig. 4 is similar to Fig. 2, but further illustrates context specific information depicted relative to the real world images.
Fig. 5 is similar to Fig. 2, but further illustrates a border about the information.
Fig. 6 is similar to Fig. 2, but further illustrates a way to modify prominence of the virtual information by changing its location on the display.
Fig. 7 is similar to Fig. 2, but further illustrates enclosing the information within a marquee.
Fig. 8 shows a process for integrating computer-generated information with real world images on a display.
DETAILED DESCRIPTION OF THE PREFERRED EMBODIMENT
Described below is a system and user interface that enables simultaneous display of virtual information and real world information with minimal distraction to the user. The user interface is described in the context of a head mounted visual display (e.g., eye glasses display) of a wearable computing system that allows a user to view the real world while overlaying additional virtual information. However, the user interface may be used for other displays and in contexts other than the wearable computing environment.
Exemplary System
Fig. 1 illustrates a body-mounted wearable computer 100 worn by a user 102. The computer 100 includes a variety of body-worn input devices, such as a microphone 110, a hand-held flat panel display 112 with character recognition capabilities, and various other user input devices 114. Examples of other types of input devices with which a user can supply information to the computer 100 include voice recognition devices, traditional qwerty keyboards, chording keyboards, half qwerty keyboards, dual forearm keyboards, chest mounted keyboards, handwriting recognition and digital ink devices, a mouse, a track pad, a digital stylus, a finger or glove device to capture user movement, pupil tracking devices, a gyropoint, a trackball, a voice grid device, digital cameras (still and motion), and so forth.
The computer 100 also has a variety of body-worn output devices, including the hand-held flat panel display 112, an earpiece speaker 116, and a head-mounted display in the form of an eyeglass-mounted display 118. The eyeglass-mounted display 118 is implemented as a display type that allows the user to view real world images from their surroundings while simultaneously overlaying or otherwise presenting computer-generated information to the user in an unobtrusive manner. The display may be constructed to permit direct viewing of real images (i.e., permitting the user to gaze directly through the display at the real world objects) or to show real world images captured from the surroundings by video devices, such as digital cameras. The display and techniques for integrating computer-generated information with the real world surroundings are described below in greater detail. Other output devices 120 may also be incorporated into the computer 100, such as a tactile display, an olfactory output device, and the like. The computer 100 may also be equipped with one or more various body-worn user sensor devices 122. For example, a variety of sensors can provide information about the current physiological state of the user and current user activities. Examples of such sensors include thermometers, sphygmometers, heart rate sensors, shiver response sensors, skin galvanometry sensors, eyelid blink sensors, pupil dilation detection sensors, EEG and EKG sensors, sensors to detect brow furrowing, blood sugar monitors, etc. In addition, sensors elsewhere in the near environment can provide information about the user, such as motion detector sensors (e.g., whether the user is present and is moving), badge readers, still and video cameras (including low light, infra-red, and x-ray), remote microphones, etc. These sensors can be either passive (i.e., detecting information generated external to the sensor, such as a heartbeat) or active (i.e., generating a signal to obtain information, such as sonar or x-rays).
The computer 100 may also be equipped with various environment sensor devices 124 that sense conditions of the environment surrounding the user. For example, devices such as microphones or motion sensors may be able to detect whether there are other people near the user and whether the user is interacting with those people. Sensors can also detect environmental conditions that may affect the user, such as air thermometers or Geiger counters. Sensors, either body-mounted or remote, can also provide information related to a wide variety of user and environment factors including location, orientation, speed, direction, distance, and proximity to other locations (e.g., GPS and differential GPS devices, orientation tracking devices, gyroscopes, altimeters, accelerometers, anemometers, pedometers, compasses, laser or optical range finders, depth gauges, sonar, etc.). Identity and informational sensors (e.g., bar code readers, biometric scanners, laser scanners, OCR, badge readers, etc.) and remote sensors (e.g., home or car alarm systems, remote cameras, national weather service web pages, baby monitors, traffic sensors, etc.) can also provide relevant environment information.
The computer 100 further includes a central computing unit 130 that may or may not be worn on the user. The various inputs, outputs, and sensors are connected to the central computing unit 130 via one or more data communications interfaces 132 that may be implemented using wire-based technologies (e.g., wires, coax, fiber optic, etc.) or wireless technologies (e.g., RF, etc.).
The central computing unit 130 includes a central processing unit (CPU) 140, a memory 142, and a storage device 144. The memory 142 may be implemented using both volatile and non-volatile memory, such as RAM, ROM, Flash, EEPROM, disk, and so forth. The storage device 144 is typically implemented using non-volatile permanent memory, such as ROM, EEPROM, diskette, memory cards, and the like.
One or more application programs 146 are stored in memory 142 and executed by the CPU 140. The application programs 146 generate data that may be output to the user via one or more of the output devices 112, 116, 118, and 120. For discussion purposes, one particular application program is illustrated with a transparent user interface (UI) component 148 that is designed to present computer-generated information to the user via the eyeglass-mounted display 118 in a manner that does not distract the user from viewing the real world. The transparent UI 148 organizes orientation and presentation of the data and provides the control parameters that direct the display 118 to place the data before the user in many different ways that account for such factors as the importance of the information, relevancy to what is being viewed in the real world, and so on.
In the illustrated implementation, a Condition-Dependent Output Supplier (CDOS) system 150 is also shown stored in memory 142. The CDOS system 150 monitors the user and the user's environment, and creates and maintains an updated model of the current condition of the user. As the user moves about in various environments, the CDOS system receives various input information, including explicit user input, sensed user information, and sensed environment information. The CDOS system updates the current model of the user condition, and presents output information to the user via appropriate output devices.
Of particular relevance, the CDOS system 150 provides information that might affect how the transparent UI 148 presents the information to the user. For instance, suppose the application program 146 is generating geographically or spatially relevant information that should only be displayed when the user is looking in a specific direction. The CDOS system 150 may be used to generate data indicating where the user is looking. If the user is looking in the correct direction, the transparent UI 148 presents the data in conjunction with the real world view of that direction. If the user turns his/her head, the CDOS system 150 detects the movement and informs the application program 146, enabling the transparent UI 148 to remove the information from the display.
A more detailed explanation of the CDOS system 150 may be found in co-pending U.S. Patent Application Serial No. 09/216,193, entitled "Method and System For Controlling Presentation of Information To a User Based On The User's Condition", which was filed December 18, 1998, and is commonly assigned to Tangis Corporation. The reader might also be interested in U.S. Patent Application Serial No. 09/724,902, entitled "Dynamically Exchanging Computer User's Context", which was filed November 28, 2000, and is commonly assigned to Tangis Corporation. These applications are hereby incorporated by reference.
Although not illustrated, the body-mounted computer 100 may be connected to one or more networks of other devices through wired or wireless communication means (e.g., wireless RF, a cellular phone or modem, infrared, physical cable, a docking station, etc.). For example, the body-mounted computer of a user could make use of output devices in a smart room, such as a television and stereo when the user is at home, if the body-mounted computer can transmit information to those devices via a wireless medium or if a cabled or docking mechanism is available to transmit the information. Alternately, kiosks or other information devices can be installed at various locations (e.g., in airports or at tourist spots) to transmit relevant information to body-mounted computers within the range of the information device.
Transparent UI
Fig. 2 shows an exemplary view that the user of the wearable computer 100 might see when looking at the eyeglass-mounted display 118. The display 118 depicts a graphical screen presentation 200 generated by the transparent UI 148 of the application program 146 executing on the wearable computer 100. The screen presentation 200 permits viewing of the real world surroundings 202, illustrated here as a mountain range.
The transparent screen presentation 200 presents information to the user in a manner that does not significantly impede the user's view of the real world 202. In this example, the virtual information consists of a menu 204 that lists various items of interest to the user. For the mountain-scaling environment, the menu 204 includes context-relevant information such as the present temperature, current elevation, and time. The menu 204 may further include navigation items that allow the user to navigate to various levels of information being monitored or stored by the computer 100. Here, the menu items include mapping, email, communication, body parameters, and geographical location. The menu 204 is placed along the side of the display to minimize any distraction from the user's vision of the real world. The menu 204 is presented transparently, enabling the user to see the real world images 202 behind the menu. By making the menu transparent and locating it along the side of the display, the information is available for the user to see, but does not impair the user's view of the mountain range.
The transparent UI possesses many features that are directed toward the goal of displaying virtual information to the user without impeding too much of the user's view of the real world. Some of these features are explored below to provide a better understanding of the transparent UI.
Dynamically Changing Degree of Transparency
The transparent UI 148 is capable of dynamically changing the transparency of the virtual information. The application program 146 can change the degree of transparency of the menu 204 (or other virtual objects) by implementing a display range from completely opaque to completely transparent. This display range allows the user to view both real world and virtual-world information at the same time, with dynamic changes being performed for a variety of reasons.
One reason to change the transparency might be the level of importance ascribed to the information. As the information is deemed more important by the application program 146 or user, the transparency is decreased to draw more attention to the information.
Another reason to vary transparency might be context specific. Integrating the transparent UI into a system that models the user's context allows the transparent UI to vary the degree of transparency in response to a rich set of states from the user, their environment, or the computer and its peripheral devices. Using this model, the system can automatically determine what parts of the virtual information to display as more or less transparent and vary their respective transparencies accordingly.
For example, if the information becomes more important in a given context, the application program may decrease the transparency toward the opaque end of the display range to increase the noticeability of the information for the user. Conversely, if the information is less relevant for a given context, the application program may increase the transparency toward the fully transparent end of the display range to diminish the noticeability of the virtual information.
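By way of illustration only, the mapping from importance and contextual relevance to a degree of transparency might be sketched as follows; the function name, weightings, and 0.0-1.0 scales are hypothetical assumptions, not prescribed by this description:

```python
def transparency_for(importance: float, relevance: float) -> float:
    """Map importance and contextual relevance (each 0.0-1.0) onto a
    degree of transparency, where 0.0 is completely opaque and 1.0 is
    completely transparent.  Any monotone mapping over the display
    range would serve; a weighted blend is used here for simplicity."""
    salience = 0.6 * importance + 0.4 * relevance   # hypothetical weighting
    return max(0.0, min(1.0, 1.0 - salience))

# Important, relevant information is rendered nearly opaque; unimportant,
# irrelevant information is rendered nearly transparent.
print(transparency_for(0.9, 0.8))   # ~0.14, mostly opaque
print(transparency_for(0.1, 0.2))   # ~0.86, mostly transparent
```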
Another reason to change transparency levels may be due to a change in the user's attention on the real world. For instance, a mapping program may display directional graphics when the user is looking in one direction and fade those graphics out (i.e., make them more transparent) when the user moves his/her head to look in another direction.
Another reason might be the user's focus as detected, for example, by the user's eye movement or focal point. When the user focuses on the real world rather than on a virtual object, that object's transparency increases; when the user returns their focus to the virtual information, the objects become visibly opaque.
The transparency may further be configured to change over time, allowing the virtual image to fade in and out depending on the circumstances. For example, an unused window can fade from view, becoming very transparent or perhaps eventually fully transparent, when the user maintains their focus elsewhere. The window may then fade back into view when the user's attention returns to it.
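One way to realize such fading, offered here only as a sketch, is to ease each object's transparency a small step toward a target value on every frame; the rate constant and frame loop below are illustrative assumptions:

```python
def step_transparency(current: float, target: float, dt: float,
                      rate: float = 2.0) -> float:
    """Ease the current transparency toward a target value.  `rate`
    governs how quickly an unused window fades from view (or fades
    back in when attention returns); dt is the frame time in seconds."""
    blend = min(1.0, rate * dt)
    return current + (target - current) * blend

# An unused window drifts toward full transparency (1.0) over time...
alpha = 0.2
for _ in range(60):                     # one second at 60 frames per second
    alpha = step_transparency(alpha, 1.0, dt=1 / 60)
# ...and is eased back toward opacity once the user's attention returns.
```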
Increased transparency generally results in the user being able to see more of the real-world view. In such a configuration, comparatively important virtual objects (e.g., those used for control, status, power, or safety) are the last virtual objects to fade from view. In some configurations, the user may configure the system to never fade specified virtual objects. This type of configuration can be performed dynamically on specific objects or by making changes to a general system configuration. The transparent UI can also be controlled by the user instead of the application program. For example, a visual target in the user interface, such as a control button or slider, can be used to adjust the transparency of the virtual objects being presented; the target can be operated by any of the input methods available to the user (e.g., voice, eye-tracking controls, keyboard, etc.).
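The per-object floor described above can be modeled as a clamp applied after any automatic adjustment; the structure and field names below are hypothetical:

```python
from dataclasses import dataclass

@dataclass
class VirtualObject:
    name: str
    transparency: float = 0.0      # 0.0 opaque .. 1.0 fully transparent
    max_transparency: float = 1.0  # 1.0 permits a complete fade-out

def apply_fade(obj: VirtualObject, requested: float) -> None:
    """Apply a requested transparency while honoring the object's floor.
    Objects configured with max_transparency < 1.0 (e.g., control,
    status, or safety indicators) never fade completely from view."""
    obj.transparency = min(requested, obj.max_transparency)

status = VirtualObject("battery-status", max_transparency=0.7)
apply_fade(status, 0.95)           # requested fade exceeds the floor...
assert status.transparency == 0.7  # ...so the object remains faintly visible
```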
Watermark Notification
The transparent UI 148 may also be configured to present faintly visible notifications with high transparency to hint to the user that additional information is available for presentation. The notification is usually depicted in response to some event about which an application desires to notify the user. The faintly visible notification notifies the user without disrupting the user's concentration on the real world surroundings. The virtual image can be formed by manipulating the real world image, akin to watermarking a digital image. Fig. 3 shows an example of a watermark notification 300 overlaid on the real world image 202. In this example, the watermark notification 300 is a graphical envelope icon that suggests to the user that new, unread electronic mail has been received. The envelope icon is illustrated in dashed lines around the edge of the full display to demonstrate that the icon is faintly visible (or highly transparent) to avoid obscuring the view of the mountain range. The user is able to see through the watermark due to its partial transparency, helping the user remain focused on the current task.
The notification may come in many different shapes, positions, and sizes, including a new window, other icon shapes, or some other graphical presentation of information to the user. Like the envelope, the watermark notification can be suggestive of a particular task to orient the user to the task at hand (i.e., read mail).
Depending on a given situation, the application program 146 can adjust the transparency of the information to make it more or less visible. Such notifications can be used in a variety of situations: when information is incoming, when more information related to the user's context or view (both virtual and real world) is available, when a reminder is triggered, whenever more information is available than can be viewed at one time, or for providing "help". Such watermarks can also be used to hint to the user about advertisements that could be presented. The watermark notification also functions as an active control that may be selected by the user to control an underlying application. When the user looks at the watermark image, or in some other way selects the image, it becomes visibly opaque. The user's methods for selecting the image include any of the various ways a user of a wearable personal computer can select graphical objects (e.g., blinking, voice selection, etc.). The user can configure this behavior in advance, or shape it at run time through commands, controls, or corrections to the system.
Once the user selects the image, the application program provides a suitable response. In the Fig. 3 example, user selection of the envelope icon 300 might cause the email program to display the newly received email message.
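A minimal sketch of that watermark life cycle, assuming a hypothetical selection callback and alpha values, might read:

```python
WATERMARK_ALPHA = 0.9   # faintly visible; assumed default
SELECTED_ALPHA = 0.0    # visibly opaque once selected

class WatermarkNotification:
    """Faint overlay hinting at pending information (e.g., new mail)
    that becomes opaque when the user looks at or otherwise selects it."""

    def __init__(self, icon: str, on_select):
        self.icon = icon
        self.transparency = WATERMARK_ALPHA
        self.on_select = on_select          # e.g., open the mail reader

    def select(self) -> None:
        self.transparency = SELECTED_ALPHA  # bring fully into view
        self.on_select()                    # hand control to the application

mail_hint = WatermarkNotification("envelope", lambda: print("opening inbox"))
mail_hint.select()
```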
Context Aware Presentation
The transparent UI may also be configured to present information in different degrees of transparency depending upon the user's context. When the wearable computer 100 is equipped with context aware components (e.g., eye movement sensors, blink detection sensors, head movement sensors, GPS systems, and the like), the application program 146 may be provided with context data that influences how the virtual information is presented to the user via the transparent UI.
Fig. 4 shows one example of presenting virtual information according to the user's context. In particular, this example illustrates a situation where the virtual information is presented to the user only when the user is facing a particular direction. Here, the user is looking toward the mountain range. Virtual information 400 in the form of a climbing aid is overlaid on the display. The climbing aid 400 highlights a desired trail to be taken by the user when scaling the mountain. The trail 400 is visible (i.e., a low degree of transparency) when the user faces in a direction such that the particular mountain is within the viewing area. As the user rotates their head slightly, while keeping the mountain within the viewing area, the trail remains indexed to the appropriate mountain, effectively moving across the screen at the rate of the head rotation. If the user turns their head away from the mountain, the computer 100 will sense that the user is looking in another direction. This data will be input to the application program controlling the trail display and the trail 400 will be removed from the display (or made completely transparent). In this manner, the climbing aid is more intuitive to the user, appearing only when the user is facing the relevant task. This is just one example of modifying the display of virtual information in conjunction with real world surroundings based on the user's context. There are many other situations that may dictate when virtual information is presented or withdrawn depending upon the user's context.
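In sketch form, gating the climbing aid on head direction reduces to an angular test against the sensed heading, with the overlay's screen position indexed to the bearing of the mountain; the field-of-view and screen-width figures are assumptions:

```python
def angular_difference(a: float, b: float) -> float:
    """Smallest absolute difference between two compass headings (degrees)."""
    d = abs(a - b) % 360.0
    return min(d, 360.0 - d)

def trail_visible(user_heading: float, mountain_bearing: float,
                  half_fov: float = 30.0) -> bool:
    """Show the trail overlay only while the mountain lies within the
    display's field of view (half_fov is an assumed value)."""
    return angular_difference(user_heading, mountain_bearing) <= half_fov

def trail_screen_x(user_heading: float, mountain_bearing: float,
                   width: int = 640, half_fov: float = 30.0) -> float:
    """Index the overlay to the mountain: as the head rotates, the trail
    moves across the screen at the rate of the rotation."""
    offset = (mountain_bearing - user_heading + 180.0) % 360.0 - 180.0
    return width / 2 + (offset / half_fov) * (width / 2)
```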
Bordering
Another technique for displaying virtual information to the user without impeding too much of the user's view of the real world is to border the computer-generated information. Borders, or other forms of outlines, are drawn around objects to provide greater control of transparency and opaqueness.
Fig. 5 illustrates the transparent UI 200 where a border 500 is drawn around the menu 204. The border 500 draws a bit more attention to the menu 204 without noticeably distracting from the user's view of the real world 202. Graphical images can be created with special borders embedded in the artwork, such that the borders can be used to highlight the virtual object.
Certain elements of the graphical information, like borders and titles, can also be given different opacity curves relating to visibility. For example, the border 500 might be assigned a different degree of transparency compared to the menu items 204, so that the border 500 would be the last to become fully transparent as the menu's transparency is increased. This behavior leaves the more distinct border 500 visible for the user to identify even after the menu items have faded to nearly full transparency, leaving the impression that the virtual object still exists. As long as the border is visible, it helps the user locate the virtual image, regardless of the transparency of the rest of the image. Another feature is to group related objects (e.g., by drawing boxes about them) so that a set of objects can be given similar degrees of transparency simultaneously.
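That distinct opacity curve for a border can be as simple as an offset from the body's transparency, clamped to the display range; the 0.3 offset here is purely illustrative:

```python
def border_transparency(body_transparency: float, offset: float = 0.3) -> float:
    """Keep a border visible longer than the items it encloses: the
    border trails the body's fade by a fixed offset, so it is the last
    element to reach full transparency (1.0)."""
    return max(0.0, body_transparency - offset)

# As the menu fades toward 1.0, the border lags behind and remains legible:
for body in (0.2, 0.6, 0.95):
    print(body, border_transparency(body))
```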
Marquees are one embodiment of object borders. Marquees are dynamic objects that add prominence beyond static or highlighted borders by flashing, moving (e.g., cycling), or blinking the border around an object. These are only examples of the variety of ways a system can highlight virtual information so the user can more easily notice when the information is overlaid on top of the real-world view.
The application program may be configured to automatically detect edges of the display object. The edge information may then be used by the application program to generate object borders dynamically.
Color Changing
Another technique for displaying virtual information in a manner that reduces the user's distraction from viewing the real world is to change colors of the virtual objects to control their transparency, and hence visibility, against a changing real world view. When a user interface containing virtually displayed information such as program windows, icons, etc., is drawn with colors that clash with, or blend into, the background of real-world colors, the user is unable to properly view the information. To avoid this situation, the application program 146 can be configured to detect conflicts of colors and re-map the virtual-world colors so the virtual objects can be easily seen by the user, and so that the virtual colors do not clash with the real-world colors. This color detection and re-mapping makes the virtual objects easier to see and promotes greater control over the transparency of the objects. Where display systems are limited in size and capabilities (e.g., resolution, contrast, etc.), color re-mapping might further involve mapping a current virtual-world color set to a smaller set of colors. The need for such reduction can be detected automatically by the computer, or the user can direct the computer to perform the adjustment.
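Clash detection and re-mapping could be sketched as a distance test in RGB space against a sampled patch of the real-world image; the threshold and substitute palette below are assumptions for illustration:

```python
def color_distance(c1, c2) -> float:
    """Euclidean distance between two (r, g, b) colors with 0-255 channels."""
    return sum((a - b) ** 2 for a, b in zip(c1, c2)) ** 0.5

def remap_if_clashing(virtual_color, background_color,
                      palette=((255, 255, 255), (0, 0, 0), (255, 255, 0)),
                      threshold=80.0):
    """If a virtual color blends into the sampled real-world background,
    substitute the palette entry farthest from that background."""
    if color_distance(virtual_color, background_color) >= threshold:
        return virtual_color                       # no clash; keep as-is
    return max(palette, key=lambda c: color_distance(c, background_color))

# Green text over green foliage is re-mapped to a contrasting color:
print(remap_if_clashing((40, 160, 60), (50, 150, 70)))   # -> (255, 255, 255)
```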
Background Transparency
Another technique for presenting virtual information concurrently with the real world images is to manipulate the transparency of the background of the virtual information. In one implementation, the visual backgrounds of virtual information can be dynamically displayed, such that the application program 146 causes the background to become transparent. This allows the user of the system to view more of the real world. By supporting control of the transparent nature of the background of presented information, the application affords greater flexibility to the user for controlling the presentation of transparent information and further aids application developers in providing flexible transparent user interfaces.
Prominence
Another feature provided by the computer system with respect to the transparent UI is the concept of "prominence". Prominence is a factor pertaining to what part of the display should be given more emphasis, such as whether the real world view or the virtual information should be highlighted to capture more of the user's attention. Prominence can be considered when determining many of the features discussed above, such as the degree of transparency, the position of the virtual information, whether to post a watermark notification, and the like. In one implementation, the user dictates prominence. For example, the computer system uses data from tracking the user's eye movement or head movement to determine whether the user wants to concentrate on the real-world view or the virtual information. Depending on the user's focus, the application program will grant more or less prominence to the real world (or virtual information). This analysis allows the system to adjust transparency dynamically. If the user's eye is focusing on virtual objects, then those objects can be given more prominence, or maintain their current prominence without fading due to lack of use. If the user's eye is focusing on the real-world view, the system can cause the virtual world to become more transparent and occlude less of the real world.
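As a sketch, prominence arbitration from eye tracking might look like the following, where the gaze classification ("virtual" versus "real") is assumed to come from the sensor layer:

```python
def adjust_prominence(gaze_target: str, virtual_alpha: float,
                      step: float = 0.05) -> float:
    """Shift transparency in favor of whichever view holds the user's
    gaze: focusing on virtual objects makes them more opaque (more
    prominent); focusing on the real world lets them recede."""
    if gaze_target == "virtual":
        return max(0.0, virtual_alpha - step)   # more opaque
    return min(1.0, virtual_alpha + step)       # recede; occlude less

alpha = 0.5
alpha = adjust_prominence("real", alpha)     # user looks past the overlay
alpha = adjust_prominence("virtual", alpha)  # user looks back at the overlay
```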
The variance of prominence can also be aided by understanding the user's context. By knowing the user's current activity and safety requirements, for example, the system can decide whether to permit greater prominence for the virtual world over the real world. Consider a situation where the user is riding a bus. The user desires the prominence to remain on the virtual world, but would still like the ability to focus temporarily on the real-world view. Brief glances at the real-world view might be appropriate in this situation. Once the user reaches the destination and leaves the bus, the prominence of the virtual world is diminished in favor of the real world view. This behavior can be configured by the user, or alternatively, the system can track eye focus to dynamically and automatically adjust the visibility of virtual information without occluding too much of the real world. The system may also be configured to respond to eye commands entered via prescribed blinking sequences. For instance, a blink of one eye can give prominence to virtual objects, while a blink of the other eye gives prominence to the real-world view instead. Alternatively, the user can direct the system to give prominence to a specific view by issuing a voice command, telling the system to increase or decrease transparency of the virtual world or virtual objects.
The system may further be configured to alter prominence dynamically in response to changes in the user's focus. Through eye tracking techniques, for example, the system can detect whether the user is looking at a specific virtual object. When the user has not viewed the object within a configurable length of time, the system slowly moves the object away from the center of the user's view, toward the user's peripheral vision. Fig. 6 shows an example of a virtual object in the form of a compass 600 that is initially given prominence at a center position 602 of the display. Here, the user is focusing on the compass to get a bearing before scaling the mountain. When the user returns their attention to the climbing task and focuses once again on the real world 202, the eye tracking feedback is given to the application program, which slowly migrates the compass 600 from its center position to a peripheral location 604, as illustrated by the direction arrow 606. If the user does not stop the object from moving, it will reach the user's peripheral vision and thus be less of a distraction to the user.
The user can stipulate that the virtual object should return and/or remain in place by any one of a variety of methods. Examples of such stop methods include a vocal command, a single long blink of an eye, or focusing the eye on a controlling aspect of the object (such as a small icon, similar in appearance to the close-window box on a PC window). Further configurable options from this stopped state include having the system eventually continue moving the object to the periphery, or instead letting the user lock the object in place (by another command similar to the one that stopped the original movement). At that point, the system no longer attempts to remove the object from the user's main focal area.
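One way to realize the migration and its stop/lock states is a small state machine driven by a last-viewed timestamp; the coordinates, dwell threshold, and speed below are illustrative only:

```python
import time

CENTER, PERIPHERY = (320, 240), (600, 60)   # assumed screen coordinates
IDLE_SECONDS = 5.0                          # assumed dwell threshold

class MigratingObject:
    """Virtual object (e.g., the compass 600) that drifts toward the
    periphery when the user's focus stays elsewhere."""

    def __init__(self):
        self.pos = CENTER
        self.last_viewed = time.monotonic()
        self.locked = False                 # a user command pins it in place

    def on_viewed(self) -> None:
        self.last_viewed = time.monotonic()

    def update(self, dt: float, speed: float = 0.2) -> None:
        """Drift toward the periphery once the user has looked away
        longer than the idle threshold, unless stopped or locked."""
        if self.locked or time.monotonic() - self.last_viewed < IDLE_SECONDS:
            return
        x, y = self.pos
        tx, ty = PERIPHERY
        step = min(1.0, speed * dt)
        self.pos = (x + (tx - x) * step, y + (ty - y) * step)
```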
As noted above, marquees add prominence beyond static or highlighted borders by flashing, moving (e.g., cycling), or blinking the border around an object, and are one of the ways a system can increase the prominence of virtual-world information so the user more easily notices it overlaid on the real-world view.
Fig. 7 shows an example of a marquee 700 that scrolls across the display to provide information to the user. In this example, the marquee 700 informs the user that their heart rate is reaching an 80% level.
Color mapping is another technique to adjust prominence, making virtual information stand out from, or fade into, the real-world view.
Method
Fig. 8 shows processes 800 for operating a transparent UI that integrates virtual information within a real world view in a manner that minimizes distraction to the user. The processes 800 may be implemented in software, or a combination of hardware and software. As such, the operations illustrated as blocks in Fig. 8 may represent computer-executable instructions that, when executed, direct the system to display virtual information and the real world in a certain manner.
At block 802, the application program 146 generates virtual information intended to be displayed on the eyeglass-mounted display. The application program 146, and namely the transparent UI 148, determines how best to present the virtual information (block 804). Factors for such a determination include the importance of the information, the user's context, the immediacy of the information, the relevancy of the information to the context, and so on. Based on this information, the transparent UI 148 might initially assign a degree of transparency and a location on the display (block 806). In the case of a notification, the transparent UI 148 might present a faint watermark of a logo or other icon on the screen. The transparent UI 148 might further consider adding a border, modifying the color of the virtual information, or changing the transparency of the information's background.
The system then monitors the user behavior and conditions that gave rise to presentation of the virtual information (block 808). Based on this monitoring or in response to express user commands, the system determines whether a change in transparency or prominence is justified (block 810). If so, the transparent UI modifies the transparency of the virtual information and/or changes its prominence by fading the virtual image out or moving it to a less prominent place on the screen (block 812).
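Read as a loop, the flow of blocks 802-812 might be summarized as follows; the object and method names are placeholders for the monitoring and rendering machinery described above:

```python
def run_transparent_ui(app, sensors, display):
    """Skeleton of the Fig. 8 flow: generate virtual information (802),
    decide its presentation (804-806), then monitor and adapt (808-812)."""
    info = app.generate_information()                      # block 802
    plan = app.plan_presentation(info)                     # block 804
    display.show(info, plan.transparency, plan.location)   # block 806
    while display.is_showing(info):
        context = sensors.sample()                         # block 808
        change = app.evaluate(info, context)               # block 810
        if change:                                         # block 812: fade or move
            display.update(info, change.transparency, change.location)
```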
Conclusion
Although the invention has been described in language specific to structural features and/or methodological acts, it is to be understood that the invention defined in the appended claims is not necessarily limited to the specific features or steps described. Rather, the specific features and steps are disclosed as exemplary forms of implementing the claimed invention.

Claims

1. A method comprising: presenting computer-generated information on a display that permits viewing of a real world context; and assigning a degree of transparency to the information to enable display of the information to a user without impeding the user's view of the real world context.
2. A method as recited in claim 1, further comprising dynamically adjusting the degree of transparency of the information.
3. A method as recited in claim 1, further comprising: receiving data pertaining to the user's context; and dynamically adjusting the degree of transparency upon changes in the user's context.
4. A method as recited in claim 1, further comprising: receiving data pertaining to the user's eye focus on the display; and dynamically adjusting the degree of transparency due to change in the user's eye focus.
5. A method as recited in claim 1, further comprising: selecting an initial location on the display to present the information; and subsequently moving the information from the initial location to a second location.
6. A method as recited in claim 1, further comprising presenting a border around the information.
7. A method as recited in claim 1, further comprising presenting the information within a marquee.
8. A method as recited in claim 1, further comprising presenting the information as a faintly visible graphic overlaid on the real world context.
9. A method as recited in claim 1, further comprising modifying a color of the information to alternately blend or distinguish the information from the real world context.
10. A method as recited in claim 1, wherein the information is presented against a background, and further comprising adjusting transparency of the background.
11. A method comprising: presenting information on a screen that permits viewing real images, the information being presented in a first degree of transparency; and modifying presentation of the information to a second degree of transparency.
12. A method as recited in claim 11, wherein the first degree of transparency is more transparent than the second degree of transparency.
13. A method as recited in claim 11, wherein the transparency ranges from fully transparent to fully opaque.
14. A method as recited in claim 11, wherein said modifying is performed in response to change of importance attributed to the information.
15. A method as recited in claim 11, wherein said modifying is performed in response to a user command.
16. A method as recited in claim 11, wherein said modifying is performed in response to a change in user context.
17. A method for operating a display that permits a view of real images, comprising: generating a notification event; and presenting, on the display, a faintly visible virtual object atop the real images to notify a user of the notification event.
18. A method as recited in claim 17, wherein the faintly visible virtual object is transparent.
19. A method for operating a display that permits a view of real images, comprising: monitoring a user's context; and alternately presenting information on the display together with the real images when the user is in a first context and not presenting the information on the display when the user is in a second context.
20. A method as recited in claim 19, wherein the information is presented in an at least partially transparent manner.
21. A method as recited in claim 19, wherein the user's context pertains to geographical location and the information comprises at least one mapping object that provides geographical guidance to the user: the monitoring comprising detecting a direction that the user is facing; and presenting the mapping object when the user is facing a first direction and not presenting the mapping object when the user is facing in a second direction.
22. A method as recited in claim 21, further comprising maintaining the mapping object relative to geographic coordinates so that the mapping object appears to track a particular real image even though the display is moved relative to the particular real image.
23. A method comprising: presenting a virtual object on a display together with a view of real world surroundings; and graphically depicting the virtual object within a border to visually distinguish the virtual object from the view of the real world surroundings.
24. A method as recited in claim 23, wherein the border comprises a geometrical element that encloses the virtual object.
25. A method as recited in claim 23, wherein the border comprises a marquee.
26. A method as recited in claim 23, further comprising: detecting one or more edges of the virtual object; and dynamically generating the border along the edges.
27. A method as recited in claim 23, further comprising: displaying the virtual object with a first degree of transparency; and displaying the border with a second degree of transparency that is different from the first degree of transparency.
28. A method as recited in claim 23, further comprising: fading out the virtual object at a first rate; and fading out the border at a second rate so that the border is visible on the display after the virtual object becomes too faint to view.
29. A method comprising: presenting information on a display that permits a view of real world images; and modifying color of the information to alternately blend or distinguish the information from the real world images.
30. A method as recited in claim 29, wherein the information is at least partially transparent.
31. A method as recited in claim 29, wherein said modifying is performed in response to change in user context.
32. A method as recited in claim 29, wherein said modifying is performed in response to change in user eye focus on the display.
33. A method as recited in claim 29, wherein said modifying is performed in response to change of importance attributed to the information.
34. A method as recited in claim 29, wherein said modifying is performed in response to a user command.
35. A method as recited in claim 29, further comprising presenting a border around the information.
36. A method as recited in claim 29, further comprising presenting the information as a faintly visible graphic overlaid on the real world images.
37. A method for operating a display that permits a view of real world images, comprising: presenting information on the display with a first level of prominence; and modifying the prominence from the first level to a second level.
38. A method as recited in claim 37, wherein said modifying is performed in response to change in user attention between the information and the real world images.
39. A method as recited in claim 37, wherein said modifying is performed in response to change in user context.
40. A method as recited in claim 37, wherein said modifying is performed in response to change of importance attributed to the information.
41. A method as recited in claim 37, wherein said modifying is performed in response to a user command.
42. A method as recited in claim 37, wherein said modifying comprises adjusting transparency of the information.
43. A method as recited in claim 37, wherein said modifying comprises moving the information to another location on the display.
44. A method comprising: presenting a virtual object on a screen together with a view of a real world environment; positioning the virtual object in a first location to entice a user to focus on the virtual object; monitoring the user's focus; and migrating the virtual object to a second location less noticeable than the first location when the user shifts focus from the virtual object to the real world environment.
45. A method comprising: presenting at least one virtual object on a view of real world images; and modifying how the virtual object is presented to alter whether the virtual object is more or less visible relative to the real world images.
46. A method as recited in claim 45, wherein the virtual object is transparent and the modifying comprises changing a degree of transparency.
47. A method as recited in claim 45, wherein the modifying comprises altering a color of the virtual object.
48. A method as recited in claim 45, wherein the modifying comprises changing a location of the virtual object relative to the real world images.
49. A computer comprising: a display that facilitates a view of real world images; a processing unit; and a software module that executes on the processing unit to present a user interface on the display, the user interface presenting information in a transparent manner to allow a user to view the information without impeding the user's view of the real world images.
50. A computer as recited in claim 49, wherein the software module adjusts transparency within a range from fully transparent to fully opaque.
51. A computer as recited in claim 49, further comprising: context sensors to detect a user's context; and the software module being configured to adjust transparency of the information presented by the user interface in response to changes in the user's context.
52. A computer as recited in claim 49, further comprising: a sensor to detect a user's eye focus; and the software module being configured to adjust transparency of the information presented by the user interface in response to changes in the user's eye focus.
53. A computer as recited in claim 49, wherein the software module is configured to adjust transparency of the information presented by the user interface in response to a user command.
54. A computer as recited in claim 49, wherein the software module moves the information on the display to make the information alternately more or less noticeable.
55. A computer as recited in claim 49, wherein the user interface presents a border around the information.
56. A computer as recited in claim 49, wherein the user interface presents the information within a marquee.
57. A computer as recited in claim 49, wherein the user interface modifies a color of the presented information to alternately blend or distinguish the information from the real world images.
58. A computer as recited in claim 49, embodied as a wearable computer that can be worn by the user.
59. A computer comprising: a display that facilitates a view of real world images; a processing unit; one or more software programs that execute on the processing unit, at least one of the programs generating an event; and a user interface depicted on the display, where in response to the event, the user interface presents a faintly visible notification overlaid on the real world images to notify the user of the event.
60. A computer as recited in claim 59, wherein the notification is a graphical element.
61. A computer as recited in claim 59, wherein the notification is transparent.
62. A computer as recited in claim 59, embodied as a wearable computer that can be worn by the user.
63. One or more computer-readable media storing computer-executable instructions that, when executed, direct a computer to: display information overlaid on real world images; and present the information transparently to reduce obstruction of the view of the real world images.
64. One or more computer-readable media as recited in claim 63, further storing computer-executable instructions that, when executed, direct a computer to dynamically adjust transparency of the transparent information.
65. One or more computer-readable media as recited in claim 63, further storing computer-executable instructions that, when executed, direct a computer to display a border around the information.
66. One or more computer-readable media as recited in claim 63, further storing computer-executable instructions that, when executed, direct a computer to modify a color of the information to alternately blend or contrast the information with the real world images.
67. One or more computer-readable media storing computer-executable instructions that, when executed, direct a computer to: receive a notification event; and in response to the notification event, display a watermark object atop real world images to notify a user of the notification event.
68. One or more computer-readable media storing computer-executable instructions that, when executed, direct a computer to: ascertain a user's context; display information transparently atop a view of real world images; and adjust transparency of the information in response to a change in the user's context.
69. One or more computer-readable media storing computer-executable instructions that, when executed, direct a computer to: display information transparently atop a view of real world images; assign a level of prominence to the information that dictates how prominently the information is displayed to the user; and adjust the level of prominence assigned to the information.
70. A user interface, comprising: at least one virtual object overlaid on a view of real world images, the virtual object being transparent; and a transparency component to dynamically adjust transparency of the virtual object.
71. A user interface as recited in claim 70, wherein the transparency ranges from fully transparent to fully opaque.
72. A system, comprising: means for presenting at least one virtual object on a view of real world images; and means for modifying how the virtual object is presented to alter whether the virtual object is more or less visible relative to the real world images.
73. A system as recited in claim 72, wherein the virtual object is transparent and the modifying means alters a degree of transparency.
74. A system as recited in claim 72, wherein the modifying means alters a color of the virtual object.
75. A system as recited in claim 72, wherein the modifying means alters a location of the virtual object relative to the real world images.
US7460884B2 (en) * 2005-06-29 2008-12-02 Microsoft Corporation Data buddy
US20070004969A1 (en) * 2005-06-29 2007-01-04 Microsoft Corporation Health monitor
US8079079B2 (en) * 2005-06-29 2011-12-13 Microsoft Corporation Multimodal authentication
US7647171B2 (en) * 2005-06-29 2010-01-12 Microsoft Corporation Learning, storing, analyzing, and reasoning about the loss of location-identifying signals
US20070005646A1 (en) * 2005-06-30 2007-01-04 Microsoft Corporation Analysis of topic dynamics of web search
US7646755B2 (en) * 2005-06-30 2010-01-12 Microsoft Corporation Seamless integration of portable computing devices and desktop computers
US7925995B2 (en) * 2005-06-30 2011-04-12 Microsoft Corporation Integration of location logs, GPS signals, and spatial resources for identifying user activities, goals, and context
US20070005754A1 (en) * 2005-06-30 2007-01-04 Microsoft Corporation Systems and methods for triaging attention for providing awareness of communications session activity
US20070050252A1 (en) * 2005-08-29 2007-03-01 Microsoft Corporation Preview pane for ads
US20070050251A1 (en) * 2005-08-29 2007-03-01 Microsoft Corporation Monetizing a preview pane for ads
JP4890552B2 (en) 2005-08-29 2012-03-07 Evryx Technologies, Inc. Interactivity via mobile image recognition
US20070050253A1 (en) * 2005-08-29 2007-03-01 Microsoft Corporation Automatically generating content for presenting in a preview pane for ads
US20070052672A1 (en) 2005-09-08 2007-03-08 Swisscom Mobile Ag Communication device, system and method
US8024112B2 (en) * 2005-09-29 2011-09-20 Microsoft Corporation Methods for predicting destinations from partial trajectories employing open- and closed-world modeling methods
US20070081123A1 (en) 2005-10-07 2007-04-12 Lewis Scott W Digital eyewear
US11428937B2 (en) * 2005-10-07 2022-08-30 Percept Technologies Enhanced optical and perceptual digital eyewear
US9658473B2 (en) * 2005-10-07 2017-05-23 Percept Technologies Inc Enhanced optical and perceptual digital eyewear
US20070091112A1 (en) * 2005-10-20 2007-04-26 Pfrehm Patrick L Method system and program for time based opacity in plots
US7778632B2 (en) * 2005-10-28 2010-08-17 Microsoft Corporation Multi-modal device capable of automated actions
US7319908B2 (en) * 2005-10-28 2008-01-15 Microsoft Corporation Multi-modal device power/mode management
US20070100704A1 (en) * 2005-10-28 2007-05-03 Microsoft Corporation Shopping assistant
US7467353B2 (en) * 2005-10-28 2008-12-16 Microsoft Corporation Aggregation of multi-modal devices
US20070112906A1 (en) * 2005-11-15 2007-05-17 Microsoft Corporation Infrastructure for multi-modal multilingual communications devices
US20070136222A1 (en) * 2005-12-09 2007-06-14 Microsoft Corporation Question and answer architecture for reasoning and clarifying intentions, goals, and needs from contextual clues and content
US20070136068A1 (en) * 2005-12-09 2007-06-14 Microsoft Corporation Multimodal multilingual devices and applications for enhanced goal-interpretation and translation for service providers
US20070150512A1 (en) * 2005-12-15 2007-06-28 Microsoft Corporation Collaborative meeting assistant
US7747557B2 (en) * 2006-01-05 2010-06-29 Microsoft Corporation Application of metadata to documents and document objects via an operating system user interface
US7797638B2 (en) * 2006-01-05 2010-09-14 Microsoft Corporation Application of metadata to documents and document objects via a software application user interface
US8566894B2 (en) 2006-02-10 2013-10-22 Scott W. Lewis Method and system for distribution of media
US7617164B2 (en) * 2006-03-17 2009-11-10 Microsoft Corporation Efficiency of training for ranking systems based on pairwise training with aggregated gradients
US20070245223A1 (en) * 2006-04-17 2007-10-18 Microsoft Corporation Synchronizing multimedia mobile notes
US20070245229A1 (en) * 2006-04-17 2007-10-18 Microsoft Corporation User experience for multimedia mobile note taking
EP1847963A1 (en) * 2006-04-20 2007-10-24 Koninklijke KPN N.V. Method and system for displaying visual information on a display
US7761464B2 (en) * 2006-06-19 2010-07-20 Microsoft Corporation Diversifying search results for improved search and personalization
US7610151B2 (en) 2006-06-27 2009-10-27 Microsoft Corporation Collaborative route planning for generating personalized and context-sensitive routing recommendations
US7917514B2 (en) * 2006-06-28 2011-03-29 Microsoft Corporation Visual and multi-dimensional search
US8874592B2 (en) 2006-06-28 2014-10-28 Microsoft Corporation Search guided by location and context
US20080005108A1 (en) * 2006-06-28 2008-01-03 Microsoft Corporation Message mining to enhance ranking of documents for retrieval
US20080005104A1 (en) * 2006-06-28 2008-01-03 Microsoft Corporation Localized marketing
US7984169B2 (en) * 2006-06-28 2011-07-19 Microsoft Corporation Anonymous and secure network-based interaction
US20080005095A1 (en) * 2006-06-28 2008-01-03 Microsoft Corporation Validation of computer responses
US20080005069A1 (en) * 2006-06-28 2008-01-03 Microsoft Corporation Entity-specific search model
US20080005067A1 (en) * 2006-06-28 2008-01-03 Microsoft Corporation Context-based search, retrieval, and awareness
US20080004948A1 (en) * 2006-06-28 2008-01-03 Microsoft Corporation Auctioning for video and audio advertising
US20080005068A1 (en) * 2006-06-28 2008-01-03 Microsoft Corporation Context-based search, retrieval, and awareness
US7739221B2 (en) * 2006-06-28 2010-06-15 Microsoft Corporation Visual and multi-dimensional search
US8788517B2 (en) * 2006-06-28 2014-07-22 Microsoft Corporation Intelligently guiding search based on user dialog
US9396269B2 (en) * 2006-06-28 2016-07-19 Microsoft Technology Licensing, Llc Search engine that identifies and uses social networks in communications, retrieval, and electronic commerce
US20080005074A1 (en) * 2006-06-28 2008-01-03 Microsoft Corporation Search over designated content
US20080005223A1 (en) * 2006-06-28 2008-01-03 Microsoft Corporation Reputation data for entities and data processing
US9141704B2 (en) * 2006-06-28 2015-09-22 Microsoft Technology Licensing, Llc Data management in social networks
US20080004990A1 (en) * 2006-06-28 2008-01-03 Microsoft Corporation Virtual spot market for advertisements
US7822762B2 (en) * 2006-06-28 2010-10-26 Microsoft Corporation Entity-specific search model
US8626136B2 (en) * 2006-06-29 2014-01-07 Microsoft Corporation Architecture for user- and context-specific prefetching and caching of information on portable devices
US7552862B2 (en) * 2006-06-29 2009-06-30 Microsoft Corporation User-controlled profile sharing
US20080004884A1 (en) * 2006-06-29 2008-01-03 Microsoft Corporation Employment of offline behavior to display online content
US7873620B2 (en) * 2006-06-29 2011-01-18 Microsoft Corporation Desktop search from mobile device
US20080005079A1 (en) * 2006-06-29 2008-01-03 Microsoft Corporation Scenario-based search
US8725567B2 (en) * 2006-06-29 2014-05-13 Microsoft Corporation Targeted advertising in brick-and-mortar establishments
US20080005313A1 (en) * 2006-06-29 2008-01-03 Microsoft Corporation Using offline activity to enhance online searching
US20080004951A1 (en) * 2006-06-29 2008-01-03 Microsoft Corporation Web-based targeted advertising in a brick-and-mortar retail establishment using online customer information
US8244240B2 (en) * 2006-06-29 2012-08-14 Microsoft Corporation Queries as data for revising and extending a sensor-based location service
US20080005047A1 (en) * 2006-06-29 2008-01-03 Microsoft Corporation Scenario-based search
US7997485B2 (en) 2006-06-29 2011-08-16 Microsoft Corporation Content presentation based on user preferences
US7617042B2 (en) 2006-06-30 2009-11-10 Microsoft Corporation Computing and harnessing inferences about the timing, duration, and nature of motion and cessation of motion with applications to mobile computing and communications
US7739040B2 (en) 2006-06-30 2010-06-15 Microsoft Corporation Computation of travel routes, durations, and plans over multiple contexts
US7797267B2 (en) * 2006-06-30 2010-09-14 Microsoft Corporation Methods and architecture for learning and reasoning in support of context-sensitive reminding, informing, and service facilitation
US20080004954A1 (en) * 2006-06-30 2008-01-03 Microsoft Corporation Methods and architecture for performing client-side directed marketing with caching and local analytics for enhanced privacy and minimal disruption
US8126641B2 (en) * 2006-06-30 2012-02-28 Microsoft Corporation Route planning with contingencies
US7706964B2 (en) * 2006-06-30 2010-04-27 Microsoft Corporation Inferring road speeds for context-sensitive routing
US8112755B2 (en) * 2006-06-30 2012-02-07 Microsoft Corporation Reducing latencies in computing systems using probabilistic and/or decision-theoretic reasoning under scarce memory resources
EP1887526A1 (en) * 2006-08-11 2008-02-13 Seac02 S.r.l. A digitally-augmented reality video system
US20080059904A1 (en) * 2006-08-30 2008-03-06 Christopher Patrick Abbey Method, apparatus, and computer program product for implementing enhanced window focus in a graphical desktop
US7707518B2 (en) * 2006-11-13 2010-04-27 Microsoft Corporation Linking information
US7761785B2 (en) 2006-11-13 2010-07-20 Microsoft Corporation Providing resilient links
US7740353B2 (en) 2006-12-14 2010-06-22 Oakley, Inc. Wearable high resolution audio visual interface
US7711716B2 (en) * 2007-03-06 2010-05-04 Microsoft Corporation Optimizations for a background database consistency check
US20080249667A1 (en) * 2007-04-09 2008-10-09 Microsoft Corporation Learning and reasoning to enhance energy efficiency in transportation systems
US9235262B2 (en) * 2009-05-08 2016-01-12 Kopin Corporation Remote control of host application using motion and voice commands
US8855719B2 (en) * 2009-05-08 2014-10-07 Kopin Corporation Wireless hands-free computing headset with detachable accessories controllable by motion, body gesture and/or vocal commands
US7970721B2 (en) * 2007-06-15 2011-06-28 Microsoft Corporation Learning and reasoning from web projections
US7539659B2 (en) * 2007-06-15 2009-05-26 Microsoft Corporation Multidimensional timeline browsers for broadcast media
US7979252B2 (en) * 2007-06-21 2011-07-12 Microsoft Corporation Selective sampling of user state based on expected utility
US20080320087A1 (en) * 2007-06-22 2008-12-25 Microsoft Corporation Swarm sensing and actuating
US20080319658A1 (en) * 2007-06-25 2008-12-25 Microsoft Corporation Landmark-based routing
US20080319660A1 (en) * 2007-06-25 2008-12-25 Microsoft Corporation Landmark-based routing
US7912637B2 (en) * 2007-06-25 2011-03-22 Microsoft Corporation Landmark-based routing
US7696866B2 (en) * 2007-06-28 2010-04-13 Microsoft Corporation Learning and reasoning about the context-sensitive reliability of sensors
US8244660B2 (en) 2007-06-28 2012-08-14 Microsoft Corporation Open-world modeling
US7991718B2 (en) * 2007-06-28 2011-08-02 Microsoft Corporation Method and apparatus for generating an inference about a destination of a trip using a combination of open-world modeling and closed world modeling
US7948400B2 (en) * 2007-06-29 2011-05-24 Microsoft Corporation Predictive models of road reliability for traffic sensor configuration and routing
US7673088B2 (en) * 2007-06-29 2010-03-02 Microsoft Corporation Multi-tasking interference model
US8254393B2 (en) * 2007-06-29 2012-08-28 Microsoft Corporation Harnessing predictive models of durations of channel availability for enhanced opportunistic allocation of radio spectrum
DE102007055023B4 (en) 2007-11-15 2023-05-17 Volkswagen AG Method and device for adapting a user interface in a motor vehicle
WO2009120984A1 (en) 2008-03-28 2009-10-01 Kopin Corporation Handheld wireless display device having high-resolution display suitable for use as a mobile internet device
US20090270694A1 (en) * 2008-04-24 2009-10-29 Searete Llc, A Limited Liability Corporation Of The State Of Delaware Methods and systems for monitoring and modifying a combination treatment
US9649469B2 (en) 2008-04-24 2017-05-16 The Invention Science Fund I Llc Methods and systems for presenting a combination treatment
US9282927B2 (en) 2008-04-24 2016-03-15 Invention Science Fund I, Llc Methods and systems for modifying bioactive agent use
US9560967B2 (en) 2008-04-24 2017-02-07 The Invention Science Fund I Llc Systems and apparatus for measuring a bioactive agent effect
US20100030089A1 (en) * 2008-04-24 2010-02-04 Searete Llc, A Limited Liability Corporation Of The State Of Delaware Methods and systems for monitoring and modifying a combination treatment
US9662391B2 (en) 2008-04-24 2017-05-30 The Invention Science Fund I Llc Side effect ameliorating combination therapeutic products and systems
US20100041964A1 (en) * 2008-04-24 2010-02-18 Searete Llc, A Limited Liability Corporation Of The State Of Delaware Methods and systems for monitoring and modifying a combination treatment
US9449150B2 (en) 2008-04-24 2016-09-20 The Invention Science Fund I, Llc Combination treatment selection methods and systems
US8026834B2 (en) * 2008-06-09 2011-09-27 Honeywell International Inc. Method and system for operating a display device
US8416152B2 (en) * 2008-06-11 2013-04-09 Honeywell International Inc. Method and system for operating a near-to-eye display
US9846049B2 (en) * 2008-07-09 2017-12-19 Microsoft Technology Licensing, Llc Route prediction
US9207894B2 (en) * 2008-09-19 2015-12-08 Microsoft Technology Licensing, Llc Print preview with page numbering for multiple pages per sheet
US20100088143A1 (en) * 2008-10-07 2010-04-08 Microsoft Corporation Calendar event scheduling
US9480919B2 (en) 2008-10-24 2016-11-01 Excalibur Ip, Llc Reconfiguring reality using a reality overlay device
US8458601B2 (en) * 2008-12-04 2013-06-04 International Business Machines Corporation System and method for item inquiry and information presentation via standard communication paths
JP5136442B2 (en) * 2009-01-27 2013-02-06 Brother Industries, Ltd. Head mounted display
JP5286371B2 (en) * 2009-02-05 2013-09-11 Panasonic Corporation Information display device and information display method
EP2401865B1 (en) * 2009-02-27 2020-07-15 Foundation Productions, Llc Headset-based telecommunications platform
US8346800B2 (en) * 2009-04-02 2013-01-01 Microsoft Corporation Content-based information retrieval
US8661030B2 (en) 2009-04-09 2014-02-25 Microsoft Corporation Re-ranking top search results
US20100275122A1 (en) * 2009-04-27 2010-10-28 Microsoft Corporation Click-through controller for mobile interaction
WO2010150220A1 (en) 2009-06-25 2010-12-29 Koninklijke Philips Electronics N.V. Method and system for controlling the rendering of at least one media signal
DE102009037835B4 (en) 2009-08-18 2012-12-06 Metaio Gmbh Method for displaying virtual information in a real environment
US8184176B2 (en) * 2009-12-09 2012-05-22 International Business Machines Corporation Digital camera blending and clashing color warning system
US9223134B2 (en) 2010-02-28 2015-12-29 Microsoft Technology Licensing, Llc Optical imperfections in a light transmissive illumination system for see-through near-eye display glasses
US9097890B2 (en) 2010-02-28 2015-08-04 Microsoft Technology Licensing, Llc Grating in a light transmissive illumination system for see-through near-eye display glasses
CN102906623A (en) * 2010-02-28 2013-01-30 奥斯特豪特集团有限公司 Local advertising content on an interactive head-mounted eyepiece
US8467133B2 (en) 2010-02-28 2013-06-18 Osterhout Group, Inc. See-through display with an optical assembly including a wedge-shaped illumination system
US9759917B2 (en) 2010-02-28 2017-09-12 Microsoft Technology Licensing, Llc AR glasses with event and sensor triggered AR eyepiece interface to external devices
US9285589B2 (en) 2010-02-28 2016-03-15 Microsoft Technology Licensing, Llc AR glasses with event and sensor triggered control of AR eyepiece applications
US8488246B2 (en) 2010-02-28 2013-07-16 Osterhout Group, Inc. See-through near-eye display glasses including a curved polarizing film in the image source, a partially reflective, partially transmitting optical element and an optically flat film
US9129295B2 (en) 2010-02-28 2015-09-08 Microsoft Technology Licensing, Llc See-through near-eye display glasses with a fast response photochromic film system for quick transition from dark to clear
US10180572B2 (en) 2010-02-28 2019-01-15 Microsoft Technology Licensing, Llc AR glasses with event and user action control of external applications
US8472120B2 (en) 2010-02-28 2013-06-25 Osterhout Group, Inc. See-through near-eye display glasses with a small scale image source
US20150309316A1 (en) 2011-04-06 2015-10-29 Microsoft Technology Licensing, Llc AR glasses with predictive control of external device based on event input
US8482859B2 (en) 2010-02-28 2013-07-09 Osterhout Group, Inc. See-through near-eye display glasses wherein image light is transmitted to and reflected from an optically flat film
US9182596B2 (en) 2010-02-28 2015-11-10 Microsoft Technology Licensing, Llc See-through near-eye display glasses with the optical assembly including absorptive polarizers or anti-reflective coatings to reduce stray light
US9091851B2 (en) 2010-02-28 2015-07-28 Microsoft Technology Licensing, Llc Light control in head mounted displays
US9134534B2 (en) 2010-02-28 2015-09-15 Microsoft Technology Licensing, Llc See-through near-eye display glasses including a modular image source
US20120249797A1 (en) 2010-02-28 2012-10-04 Osterhout Group, Inc. Head-worn adaptive display
US8477425B2 (en) 2010-02-28 2013-07-02 Osterhout Group, Inc. See-through near-eye display glasses including a partially reflective, partially transmitting optical element
US9097891B2 (en) 2010-02-28 2015-08-04 Microsoft Technology Licensing, Llc See-through near-eye display glasses including an auto-brightness control for the display brightness based on the brightness in the environment
US9341843B2 (en) 2010-02-28 2016-05-17 Microsoft Technology Licensing, Llc See-through near-eye display glasses with a small scale image source
US9366862B2 (en) 2010-02-28 2016-06-14 Microsoft Technology Licensing, Llc System and method for delivering content to a group of see-through near eye display eyepieces
US9128281B2 (en) 2010-09-14 2015-09-08 Microsoft Technology Licensing, Llc Eyepiece with uniformly illuminated reflective display
US9229227B2 (en) 2010-02-28 2016-01-05 Microsoft Technology Licensing, Llc See-through near-eye display glasses with a light transmissive wedge shaped illumination system
US8581844B2 (en) * 2010-06-23 2013-11-12 Google Inc. Switching between a first operational mode and a second operational mode using a natural motion gesture
US20110320981A1 (en) * 2010-06-23 2011-12-29 Microsoft Corporation Status-oriented mobile device
US9305263B2 (en) 2010-06-30 2016-04-05 Microsoft Technology Licensing, Llc Combining human and machine intelligence to solve tasks with crowd sourcing
PL391800A1 (en) * 2010-07-12 2012-01-16 Diagnova Technologies Spółka Cywilna Method and system for virtual presentation of a 3D image
US20120038663A1 (en) * 2010-08-12 2012-02-16 Harald Gustafsson Composition of a Digital Image for Display on a Transparent Screen
US20120050044A1 (en) * 2010-08-25 2012-03-01 Border John N Head-mounted display with biological state detection
US9111498B2 (en) * 2010-08-25 2015-08-18 Eastman Kodak Company Head-mounted display with environmental state detection
US20120050140A1 (en) * 2010-08-25 2012-03-01 Border John N Head-mounted display control
US20120050142A1 (en) * 2010-08-25 2012-03-01 Border John N Head-mounted display with eye state detection
US8780014B2 (en) * 2010-08-25 2014-07-15 Eastman Kodak Company Switchable head-mounted display
US8619005B2 (en) * 2010-09-09 2013-12-31 Eastman Kodak Company Switchable head-mounted display transition
CN110545366A (en) 2010-09-13 2019-12-06 康道尔知识产权控股有限责任公司 Portable digital video camera configured for remote image acquisition control and viewing
US10013976B2 (en) 2010-09-20 2018-07-03 Kopin Corporation Context sensitive overlays in voice controlled headset computer displays
US9122307B2 (en) 2010-09-20 2015-09-01 Kopin Corporation Advanced remote control of host application using motion and voice commands
US20120069046A1 (en) * 2010-09-22 2012-03-22 Raytheon Company Systems and methods for displaying computer-generated images on a head mounted device
US10036891B2 (en) * 2010-10-12 2018-07-31 DISH Technologies L.L.C. Variable transparency heads up displays
KR101266198B1 (en) * 2010-10-19 2013-05-21 Pantech Co., Ltd. Display apparatus and display method for enhancing the visibility of an augmented reality object
US20120098761A1 (en) * 2010-10-22 2012-04-26 April Slayden Mitchell Display system and method of display for supporting multiple display modes
US20120098971A1 (en) * 2010-10-22 2012-04-26 Flir Systems, Inc. Infrared binocular system with dual diopter adjustment
US9489102B2 (en) * 2010-10-22 2016-11-08 Hewlett-Packard Development Company, L.P. System and method of modifying lighting in a display system
US9164581B2 (en) 2010-10-22 2015-10-20 Hewlett-Packard Development Company, L.P. Augmented reality display system and method of display
US8854802B2 (en) 2010-10-22 2014-10-07 Hewlett-Packard Development Company, L.P. Display with rotatable display screen
US20120113141A1 (en) * 2010-11-09 2012-05-10 Cbs Interactive Inc. Techniques to visualize products using augmented reality
US8594381B2 (en) * 2010-11-17 2013-11-26 Eastman Kodak Company Method of identifying motion sickness
US8565783B2 (en) 2010-11-24 2013-10-22 Microsoft Corporation Path progression matching for indoor positioning systems
US9589254B2 (en) 2010-12-08 2017-03-07 Microsoft Technology Licensing, Llc Using e-mail message characteristics for prioritization
US9134137B2 (en) 2010-12-17 2015-09-15 Microsoft Technology Licensing, Llc Mobile search based on predicted location
US8601380B2 (en) * 2011-03-16 2013-12-03 Nokia Corporation Method and apparatus for displaying interactive preview information in a location-based user interface
US8810598B2 (en) 2011-04-08 2014-08-19 Nant Holdings Ip, Llc Interference based augmented reality hosting platforms
US9163952B2 (en) 2011-04-15 2015-10-20 Microsoft Technology Licensing, Llc Suggestive mapping
US8836771B2 (en) * 2011-04-26 2014-09-16 Echostar Technologies L.L.C. Apparatus, systems and methods for shared viewing experience using head mounted displays
WO2012154938A1 (en) * 2011-05-10 2012-11-15 Kopin Corporation Headset computer that uses motion and voice commands to control information display and remote devices
US9105134B2 (en) 2011-05-24 2015-08-11 International Business Machines Corporation Techniques for visualizing the age of data in an analytics report
US8935301B2 (en) * 2011-05-24 2015-01-13 International Business Machines Corporation Data context selection in business analytics reports
US8749573B2 (en) * 2011-05-26 2014-06-10 Nokia Corporation Method and apparatus for providing input through an apparatus configured to provide for display of an image
US8981995B2 (en) 2011-06-03 2015-03-17 Microsoft Technology Licensing, Llc. Low accuracy positional data by detecting improbable samples
US20120327116A1 (en) * 2011-06-23 2012-12-27 Microsoft Corporation Total field of view classification for head-mounted display
US9470529B2 (en) 2011-07-14 2016-10-18 Microsoft Technology Licensing, Llc Activating and deactivating sensors for dead reckoning
US9464903B2 (en) 2011-07-14 2016-10-11 Microsoft Technology Licensing, Llc Crowd sourcing based on dead reckoning
US8912979B1 (en) 2011-07-14 2014-12-16 Google Inc. Virtual window in head-mounted display
US20130021374A1 (en) * 2011-07-20 2013-01-24 Google Inc. Manipulating And Displaying An Image On A Wearable Computing System
JP5868050B2 (en) * 2011-07-20 2016-02-24 Canon Inc. Display device and control method thereof
US9342610B2 (en) * 2011-08-25 2016-05-17 Microsoft Technology Licensing, Llc Portals: registered objects as virtualized, personalized displays
JP6144681B2 (en) * 2011-08-30 2017-06-07 Microsoft Technology Licensing, Llc Head mounted display with iris scan profiling function
US8538686B2 (en) 2011-09-09 2013-09-17 Microsoft Corporation Transport-dependent prediction of destinations
CN102508861B (en) * 2011-09-30 2015-09-09 Huawei Technologies Co., Ltd. Web page color setting method, web browser, and web page server
US8990682B1 (en) 2011-10-05 2015-03-24 Google Inc. Methods and devices for rendering interactions between virtual and physical objects on a substantially transparent display
US20130088507A1 (en) * 2011-10-06 2013-04-11 Nokia Corporation Method and apparatus for controlling the visual representation of information upon a see-through display
US9081177B2 (en) 2011-10-07 2015-07-14 Google Inc. Wearable computer with nearby object response
US10184798B2 (en) 2011-10-28 2019-01-22 Microsoft Technology Licensing, Llc Multi-stage dead reckoning for crowd sourcing
US9547406B1 (en) 2011-10-31 2017-01-17 Google Inc. Velocity-based triggering
KR20130056529A (en) * 2011-11-22 2013-05-30 Samsung Electronics Co., Ltd. Apparatus and method for providing augmented reality service in portable terminal
US20130127908A1 (en) * 2011-11-22 2013-05-23 General Instrument Corporation Method and apparatus for dynamic placement of a graphics display window within an image
US9933620B2 (en) 2011-12-06 2018-04-03 E-Vision Smart Optics, Inc. Eye-mounted display system and method for providing images
JP6243112B2 (en) * 2011-12-09 2017-12-06 Sony Corporation Information processing apparatus, information processing method, and recording medium
US9429657B2 (en) 2011-12-14 2016-08-30 Microsoft Technology Licensing, Llc Power efficient activation of a device movement sensor module
US8775337B2 (en) 2011-12-19 2014-07-08 Microsoft Corporation Virtual sensor development
US9223138B2 (en) 2011-12-23 2015-12-29 Microsoft Technology Licensing, Llc Pixel opacity for augmented reality
WO2013101438A1 (en) 2011-12-29 2013-07-04 Kopin Corporation Wireless hands-free computing head mounted video eyewear for local/remote diagnosis and repair
US9213185B1 (en) * 2012-01-06 2015-12-15 Google Inc. Display scaling based on movement of a head-mounted display
US9606586B2 (en) 2012-01-23 2017-03-28 Microsoft Technology Licensing, Llc Heat transfer device
US9779643B2 (en) 2012-02-15 2017-10-03 Microsoft Technology Licensing, Llc Imaging structure emitter configurations
US9297996B2 (en) 2012-02-15 2016-03-29 Microsoft Technology Licensing, Llc Laser illumination scanning
US9368546B2 (en) 2012-02-15 2016-06-14 Microsoft Technology Licensing, Llc Imaging structure with embedded light sources
US9726887B2 (en) 2012-02-15 2017-08-08 Microsoft Technology Licensing, Llc Imaging structure color conversion
US9578318B2 (en) 2012-03-14 2017-02-21 Microsoft Technology Licensing, Llc Imaging structure emitter calibration
US20130246967A1 (en) * 2012-03-15 2013-09-19 Google Inc. Head-Tracked User Interaction with Graphical Interface
US8947322B1 (en) * 2012-03-19 2015-02-03 Google Inc. Context detection and context-based user-interface population
US11068049B2 (en) * 2012-03-23 2021-07-20 Microsoft Technology Licensing, Llc Light guide display and field of view
US8957916B1 (en) * 2012-03-23 2015-02-17 Google Inc. Display method
JP6066037B2 (en) * 2012-03-27 2017-01-25 Seiko Epson Corporation Head-mounted display device
US9558590B2 (en) 2012-03-28 2017-01-31 Microsoft Technology Licensing, Llc Augmented reality light guide display
US10191515B2 (en) 2012-03-28 2019-01-29 Microsoft Technology Licensing, Llc Mobile device light guide display
US9717981B2 (en) 2012-04-05 2017-08-01 Microsoft Technology Licensing, Llc Augmented reality and physical games
US8756002B2 (en) * 2012-04-17 2014-06-17 Nokia Corporation Method and apparatus for conditional provisioning of position-related information
JP6289448B2 (en) 2012-04-25 2018-03-07 Kopin Corporation Instant translation system
US9595137B2 (en) * 2012-04-26 2017-03-14 Intel Corporation Augmented reality computing device, apparatus and system
US20130293530A1 (en) * 2012-05-04 2013-11-07 Kathryn Stone Perez Product augmentation and advertising in see through displays
US20130300635A1 (en) * 2012-05-09 2013-11-14 Nokia Corporation Method and apparatus for providing focus correction of displayed information
US20130300634A1 (en) * 2012-05-09 2013-11-14 Nokia Corporation Method and apparatus for determining representations of displayed information based on focus distance
US9442290B2 (en) 2012-05-10 2016-09-13 Kopin Corporation Headset computer operation using vehicle sensor feedback for remote control vehicle
US10502876B2 (en) 2012-05-22 2019-12-10 Microsoft Technology Licensing, Llc Waveguide optics focus elements
US8989535B2 (en) 2012-06-04 2015-03-24 Microsoft Technology Licensing, Llc Multiple waveguide imaging structure
US9824601B2 (en) 2012-06-12 2017-11-21 Dassault Systemes Symbiotic helper
US9219901B2 (en) 2012-06-19 2015-12-22 Qualcomm Incorporated Reactive user interface for head-mounted display
US9339726B2 (en) * 2012-06-29 2016-05-17 Nokia Technologies Oy Method and apparatus for modifying the presentation of information based on the visual complexity of environment information
KR101986218B1 (en) * 2012-08-02 2019-06-05 Samsung Electronics Co., Ltd. Apparatus and method for display
US9823745B1 (en) 2012-08-30 2017-11-21 Atheer, Inc. Method and apparatus for selectively presenting content
US9142185B2 (en) 2012-08-30 2015-09-22 Atheer, Inc. Method and apparatus for selectively presenting content
WO2014033306A1 (en) * 2012-09-03 2014-03-06 SensoMotoric Instruments Gesellschaft für innovative Sensorik mbH Head mounted system and method to compute and render a stream of digital images using a head mounted system
US9817125B2 (en) 2012-09-07 2017-11-14 Microsoft Technology Licensing, Llc Estimating and predicting structures proximate to a mobile device
DE102012216057A1 (en) * 2012-09-11 2014-05-28 Bayerische Motoren Werke Aktiengesellschaft Arranging displays in a head-mounted display
US10269179B2 (en) 2012-10-05 2019-04-23 Elwha Llc Displaying second augmentations that are based on registered first augmentations
US9077647B2 (en) 2012-10-05 2015-07-07 Elwha Llc Correlating user reactions with augmentations displayed through augmented views
US9141188B2 (en) 2012-10-05 2015-09-22 Elwha Llc Presenting an augmented view in response to acquisition of data inferring user activity
US9111383B2 (en) 2012-10-05 2015-08-18 Elwha Llc Systems and methods for obtaining and using augmentation data and for sharing usage data
US10180715B2 (en) 2012-10-05 2019-01-15 Elwha Llc Correlating user reaction with at least an aspect associated with an augmentation of an augmented view
US10713846B2 (en) * 2012-10-05 2020-07-14 Elwha Llc Systems and methods for sharing augmentation data
KR20140045801A (en) * 2012-10-09 2014-04-17 Samsung Electronics Co., Ltd. Transparent display apparatus and control method thereof
US9639235B2 (en) * 2012-11-01 2017-05-02 Baker Hughes Incorporated Selection of borehole and well data for visualization
US9619911B2 (en) 2012-11-13 2017-04-11 Qualcomm Incorporated Modifying virtual object display properties
US9265458B2 (en) 2012-12-04 2016-02-23 Sync-Think, Inc. Application of smooth pursuit cognitive testing paradigms to clinical drug development
JP6134004B2 (en) * 2012-12-06 2017-05-24 E-Vision Smart Optics, Inc. Systems, devices, and/or methods for providing images
US10192358B2 (en) 2012-12-20 2019-01-29 Microsoft Technology Licensing, Llc Auto-stereoscopic augmented reality display
US9812046B2 (en) * 2013-01-10 2017-11-07 Microsoft Technology Licensing, Llc Mixed reality display accommodation
KR102051656B1 (en) * 2013-01-22 2019-12-03 Samsung Electronics Co., Ltd. Transparent display apparatus and method thereof
US9791921B2 (en) 2013-02-19 2017-10-17 Microsoft Technology Licensing, Llc Context-aware augmented reality object commands
US9301085B2 (en) 2013-02-20 2016-03-29 Kopin Corporation Computer headset with detachable 4G radio
US9380976B2 (en) 2013-03-11 2016-07-05 Sync-Think, Inc. Optical neuroinformatics
US9566509B2 (en) * 2013-03-12 2017-02-14 Disney Enterprises, Inc. Adaptive rendered environments using user context
US10109075B2 (en) 2013-03-15 2018-10-23 Elwha Llc Temporal element restoration in augmented reality systems
US10025486B2 (en) 2013-03-15 2018-07-17 Elwha Llc Cross-reality select, drag, and drop for augmented reality systems
EP2973533A4 (en) 2013-03-15 2016-11-30 Oakley Inc Electronic ornamentation for eyewear
US11181740B1 (en) 2013-03-15 2021-11-23 Percept Technologies Inc Digital eyewear procedures related to dry eyes
US9639964B2 (en) 2013-03-15 2017-05-02 Elwha Llc Dynamically preserving scene elements in augmented reality systems
DE102013207063A1 (en) 2013-04-19 2014-10-23 Bayerische Motoren Werke Aktiengesellschaft A method of selecting an information source from a plurality of information sources for display on a display of data glasses
DE102013207064A1 (en) 2013-04-19 2014-10-23 Bayerische Motoren Werke Aktiengesellschaft Method for selecting an information source for display on data glasses
US9383819B2 (en) * 2013-06-03 2016-07-05 Daqri, Llc Manipulation of virtual object in augmented reality via intent
US9354702B2 (en) * 2013-06-03 2016-05-31 Daqri, Llc Manipulation of virtual object in augmented reality via thought
US10262462B2 (en) 2014-04-18 2019-04-16 Magic Leap, Inc. Systems and methods for augmented and virtual reality
WO2014201213A1 (en) 2013-06-12 2014-12-18 Oakley, Inc. Modular heads-up display system
US9235051B2 (en) 2013-06-18 2016-01-12 Microsoft Technology Licensing, Llc Multi-space connected virtual data objects
JP6252002B2 (en) * 2013-07-11 2017-12-27 Seiko Epson Corporation Head-mounted display device and method for controlling head-mounted display device
JP6375662B2 (en) * 2014-03-27 2018-08-22 Seiko Epson Corporation Head-mounted display device
TW201502581A (en) * 2013-07-11 2015-01-16 Seiko Epson Corp Head mounted display device and control method for head mounted display device
JP6095781B2 (en) * 2013-07-18 2017-03-15 Mitsubishi Electric Corporation Information presenting apparatus and information presenting method
GB2517143A (en) * 2013-08-07 2015-02-18 Nokia Corp Apparatus, method, computer program and system for a near eye display
KR102108066B1 (en) * 2013-09-02 2020-05-08 LG Electronics Inc. Head mounted display device and method for controlling the same
US10761566B2 (en) * 2013-09-27 2020-09-01 Beijing Lenovo Software Ltd. Electronic apparatus and method for processing information
US10318100B2 (en) 2013-10-16 2019-06-11 Atheer, Inc. Method and apparatus for addressing obstruction in an interface
US9582516B2 (en) 2013-10-17 2017-02-28 Nant Holdings Ip, Llc Wide area augmented reality location-based services
KR102170749B1 (en) * 2013-11-29 2020-10-28 Samsung Electronics Co., Ltd. Electronic device comprising a transparent display and method for controlling the same
US9576188B2 (en) * 2013-12-23 2017-02-21 Atheer, Inc. Method and apparatus for subject identification
US9626801B2 (en) * 2013-12-31 2017-04-18 Daqri, Llc Visualization of physical characteristics in augmented reality
FR3016448B1 (en) * 2014-01-15 2017-05-26 Dassault Aviation Aircraft information display system and associated method
US20150228119A1 (en) 2014-02-11 2015-08-13 Osterhout Group, Inc. Spatial location presentation in head worn computing
US9753288B2 (en) 2014-01-21 2017-09-05 Osterhout Group, Inc. See-through computer display systems
US9442631B1 (en) 2014-01-27 2016-09-13 Google Inc. Methods and systems for hands-free browsing in a wearable computing device
US9135849B2 (en) * 2014-01-31 2015-09-15 International Business Machines Corporation Variable operating mode HMD application management based upon crowd determined distraction
CN106030692B (en) * 2014-02-20 2019-11-15 Sony Corporation Display control unit, display control method and computer program
FR3020880B1 (en) * 2014-05-09 2016-05-27 Thales SA Head-worn display comprising an optical mixer with controllable pupil expansion
KR102209511B1 (en) 2014-05-12 2021-01-29 LG Electronics Inc. Wearable glass-type device and method of controlling the device
US9600743B2 (en) 2014-06-27 2017-03-21 International Business Machines Corporation Directing field of vision based on personal interests
US9865089B2 (en) 2014-07-25 2018-01-09 Microsoft Technology Licensing, Llc Virtual reality environment with real world objects
US10416760B2 (en) 2014-07-25 2019-09-17 Microsoft Technology Licensing, Llc Gaze-based object placement within a virtual reality environment
US10451875B2 (en) 2014-07-25 2019-10-22 Microsoft Technology Licensing, Llc Smart transparency for virtual objects
US9858720B2 (en) 2014-07-25 2018-01-02 Microsoft Technology Licensing, Llc Three-dimensional mixed-reality viewport
US10311638B2 (en) 2014-07-25 2019-06-04 Microsoft Technology Licensing, Llc Anti-trip when immersed in a virtual reality environment
US9766460B2 (en) 2014-07-25 2017-09-19 Microsoft Technology Licensing, Llc Ground plane adjustment in a virtual reality environment
US9904055B2 (en) 2014-07-25 2018-02-27 Microsoft Technology Licensing, Llc Smart placement of virtual objects to stay in the field of view of a head mounted display
US9304235B2 (en) 2014-07-30 2016-04-05 Microsoft Technology Licensing, Llc Microfabrication
US9965030B2 (en) * 2014-07-31 2018-05-08 Samsung Electronics Co., Ltd. Wearable glasses and method of displaying image via the wearable glasses
US10592080B2 (en) 2014-07-31 2020-03-17 Microsoft Technology Licensing, Llc Assisted presentation of application windows
US10254942B2 (en) 2014-07-31 2019-04-09 Microsoft Technology Licensing, Llc Adaptive sizing and positioning of application windows
US10678412B2 (en) 2014-07-31 2020-06-09 Microsoft Technology Licensing, Llc Dynamic joint dividers for application windows
US9946361B2 (en) * 2014-08-14 2018-04-17 Qualcomm Incorporated Management for wearable display
GB201414609D0 (en) * 2014-08-18 2014-10-01 Tosas Bautista Martin Systems and methods for dealing with augmented reality overlay issues
US9471837B2 (en) * 2014-08-19 2016-10-18 International Business Machines Corporation Real-time analytics to identify visual objects of interest
US10067561B2 (en) * 2014-09-22 2018-09-04 Facebook, Inc. Display visibility based on eye convergence
US9910518B2 (en) * 2014-10-01 2018-03-06 Rockwell Automation Technologies, Inc. Transparency augmented industrial automation display
US10140768B2 (en) * 2014-10-17 2018-11-27 Seiko Epson Corporation Head mounted display, method of controlling head mounted display, and computer program
JP6358038B2 (en) * 2014-10-17 2018-07-18 Seiko Epson Corporation Head-mounted display device, method for controlling head-mounted display device, and computer program
WO2016071244A2 (en) 2014-11-06 2016-05-12 Koninklijke Philips N.V. Method and system of communication for use in hospitals
EP3023863A1 (en) * 2014-11-20 2016-05-25 Thomson Licensing Device and method for processing visual data, and related computer program product
GB2532954A (en) * 2014-12-02 2016-06-08 Ibm Display control system for an augmented reality display system
US20160170206A1 (en) * 2014-12-12 2016-06-16 Lenovo (Singapore) Pte. Ltd. Glass opacity shift based on determined characteristics
KR20170095885A (en) * 2014-12-22 2017-08-23 Essilor International (Compagnie Générale d'Optique) A method for adapting the sensorial output mode of a sensorial output device to a user
US10018844B2 (en) 2015-02-09 2018-07-10 Microsoft Technology Licensing, Llc Wearable image display system
US11086216B2 (en) 2015-02-09 2021-08-10 Microsoft Technology Licensing, Llc Generating electronic components
US10317677B2 (en) 2015-02-09 2019-06-11 Microsoft Technology Licensing, Llc Display system
US9372347B1 (en) 2015-02-09 2016-06-21 Microsoft Technology Licensing, Llc Display system
US9513480B2 (en) 2015-02-09 2016-12-06 Microsoft Technology Licensing, Llc Waveguide
US9535253B2 (en) 2015-02-09 2017-01-03 Microsoft Technology Licensing, Llc Display system
US9429692B1 (en) 2015-02-09 2016-08-30 Microsoft Technology Licensing, Llc Optical components
US9827209B2 (en) 2015-02-09 2017-11-28 Microsoft Technology Licensing, Llc Display system
US9423360B1 (en) 2015-02-09 2016-08-23 Microsoft Technology Licensing, Llc Optical components
US10878775B2 (en) 2015-02-17 2020-12-29 Mentor Acquisition One, Llc See-through computer display systems
US20160239985A1 (en) * 2015-02-17 2016-08-18 Osterhout Group, Inc. See-through computer display systems
US9940521B2 (en) * 2015-02-27 2018-04-10 Sony Corporation Visibility enhancement devices, systems, and methods
JP6766062B2 (en) * 2015-03-17 2020-10-07 Intuitive Surgical Operations, Inc. Systems and methods for on-screen identification of instruments in remote-controlled medical systems
EP3283907A4 (en) 2015-04-15 2018-05-02 Razer (Asia-Pacific) Pte. Ltd. Filtering devices and filtering methods
US20160314621A1 (en) 2015-04-27 2016-10-27 David M. Hill Mixed environment display of attached data
US10007413B2 (en) 2015-04-27 2018-06-26 Microsoft Technology Licensing, Llc Mixed environment display of attached control elements
US9898865B2 (en) * 2015-06-22 2018-02-20 Microsoft Technology Licensing, Llc System and method for spawning drawing surfaces
US10409443B2 (en) * 2015-06-24 2019-09-10 Microsoft Technology Licensing, Llc Contextual cursor display based on hand tracking
US10003749B1 (en) * 2015-07-01 2018-06-19 Steven Mark Audette Apparatus and method for cloaked outdoor electronic signage
KR20170005602A (en) * 2015-07-06 2017-01-16 Samsung Electronics Co., Ltd. Method for providing integrated augmented reality and virtual reality, and electronic device using the same
US20170103574A1 (en) * 2015-10-13 2017-04-13 Google Inc. System and method for providing continuity between real world movement and movement in a virtual/augmented reality experience
US20170132845A1 (en) * 2015-11-10 2017-05-11 Dirty Sky Games, LLC System and Method for Reducing Virtual Reality Simulation Sickness
US20170153698A1 (en) * 2015-11-30 2017-06-01 Nokia Technologies Oy Method and apparatus for providing a view window within a virtual reality scene
US10591728B2 (en) 2016-03-02 2020-03-17 Mentor Acquisition One, Llc Optical systems for head-worn computers
US10667981B2 (en) 2016-02-29 2020-06-02 Mentor Acquisition One, Llc Reading assistance system for visually impaired
US10387719B2 (en) * 2016-05-20 2019-08-20 Daqri, Llc Biometric based false input detection for a wearable computing device
CN107436491A (en) * 2016-05-26 2017-12-05 Arima Communications (Jiangsu) Co., Ltd. Threat warning system and threat warning method for a virtual reality display device
US9870064B2 (en) 2016-06-13 2018-01-16 Rouslan Lyubomirov DIMITROV System and method for blended reality user interface and gesture control system
JPWO2018003650A1 (en) * 2016-06-29 2019-05-30 日本精機株式会社 Head-up display
JP6723895B2 (en) * 2016-10-25 2020-07-15 Nintendo Co., Ltd. Information processing program, information processing apparatus, information processing system, and information processing method
US10212157B2 (en) 2016-11-16 2019-02-19 Bank Of America Corporation Facilitating digital data transfers using augmented reality display devices
US10158634B2 (en) 2016-11-16 2018-12-18 Bank Of America Corporation Remote document execution and network transfer using augmented reality display devices
US10943229B2 (en) 2016-11-29 2021-03-09 Bank Of America Corporation Augmented reality headset and digital wallet
US10600111B2 (en) 2016-11-30 2020-03-24 Bank Of America Corporation Geolocation notifications using augmented reality user devices
US10339583B2 (en) 2016-11-30 2019-07-02 Bank Of America Corporation Object recognition and analysis using augmented reality user devices
US10685386B2 (en) 2016-11-30 2020-06-16 Bank Of America Corporation Virtual assessments using augmented reality user devices
US10586220B2 (en) 2016-12-02 2020-03-10 Bank Of America Corporation Augmented reality dynamic authentication
US10481862B2 (en) 2016-12-02 2019-11-19 Bank Of America Corporation Facilitating network security analysis using virtual reality display devices
US10607230B2 (en) 2016-12-02 2020-03-31 Bank Of America Corporation Augmented reality dynamic authentication for electronic transactions
US10311223B2 (en) 2016-12-02 2019-06-04 Bank Of America Corporation Virtual reality dynamic authentication
US10109096B2 (en) 2016-12-08 2018-10-23 Bank Of America Corporation Facilitating dynamic across-network location determination using augmented reality display devices
US10109095B2 (en) 2016-12-08 2018-10-23 Bank Of America Corporation Facilitating dynamic across-network location determination using augmented reality display devices
US10217375B2 (en) 2016-12-13 2019-02-26 Bank Of America Corporation Virtual behavior training using augmented reality user devices
US10210767B2 (en) 2016-12-13 2019-02-19 Bank Of America Corporation Real world gamification using augmented reality user devices
US11507216B2 (en) 2016-12-23 2022-11-22 Realwear, Inc. Customizing user interfaces of binary applications
US11099716B2 (en) 2016-12-23 2021-08-24 Realwear, Inc. Context based content navigation for wearable display
US10936872B2 (en) 2016-12-23 2021-03-02 Realwear, Inc. Hands-free contextually aware object interaction for wearable display
US10393312B2 (en) 2016-12-23 2019-08-27 Realwear, Inc. Articulating components for a head-mounted display
US10437070B2 (en) 2016-12-23 2019-10-08 Realwear, Inc. Interchangeable optics for a head-mounted display
US10620910B2 (en) * 2016-12-23 2020-04-14 Realwear, Inc. Hands-free navigation of touch-based operating systems
JP2018137505A (en) * 2017-02-20 2018-08-30 Seiko Epson Corporation Display device and control method thereof
US10169973B2 (en) 2017-03-08 2019-01-01 International Business Machines Corporation Discontinuing display of virtual content and providing alerts based on hazardous physical obstructions
US10747386B2 (en) * 2017-06-01 2020-08-18 Samsung Electronics Co., Ltd. Systems and methods for window control in virtual reality environment
US10460527B2 (en) * 2017-06-30 2019-10-29 Tobii Ab Systems and methods for displaying images in a virtual world environment
US10691945B2 (en) 2017-07-14 2020-06-23 International Business Machines Corporation Altering virtual content based on the presence of hazardous physical obstructions
US10803642B2 (en) * 2017-08-18 2020-10-13 Adobe Inc. Collaborative virtual reality anti-nausea and video streaming techniques
DK180470B1 (en) 2017-08-31 2021-05-06 Apple Inc Systems, methods, and graphical user interfaces for interacting with augmented and virtual reality environments
US11132053B2 (en) * 2017-09-28 2021-09-28 Apple Inc. Method and device for surfacing physical environment interactions during simulated reality sessions
DK201870346A1 (en) 2018-01-24 2019-09-12 Apple Inc. Devices, Methods, and Graphical User Interfaces for System-Wide Behavior for 3D Models
US10818086B2 (en) * 2018-02-09 2020-10-27 Lenovo (Singapore) Pte. Ltd. Augmented reality content characteristic adjustment
US11145096B2 (en) 2018-03-07 2021-10-12 Samsung Electronics Co., Ltd. System and method for augmented reality interaction
US20190385372A1 (en) * 2018-06-15 2019-12-19 Microsoft Technology Licensing, Llc Positioning a virtual reality passthrough region at a known distance
US10921595B2 (en) * 2018-06-29 2021-02-16 International Business Machines Corporation Contextual adjustment to augmented reality glasses
TWI675583B (en) 2018-07-23 2019-10-21 Wistron Corporation Augmented reality system and color compensation method thereof
CA3120516A1 (en) 2018-11-19 2020-05-28 E-Vision Smart Optics, Inc. Beam steering devices
US11340758B1 (en) * 2018-12-27 2022-05-24 Meta Platforms, Inc. Systems and methods for distributing content
US10845842B2 (en) * 2019-03-29 2020-11-24 Lenovo (Singapore) Pte. Ltd. Systems and methods for presentation of input elements based on direction to a user
US11846783B2 (en) * 2019-05-17 2023-12-19 Sony Group Corporation Information processing apparatus, information processing method, and program
FR3098932B1 (en) * 2019-07-15 2023-12-22 Airbus Helicopters Method and system for assisting the piloting of an aircraft by adaptive display on a screen
US11256855B2 (en) * 2019-08-09 2022-02-22 Zave IP, LLC Systems and methods for collation of digital content
JP7240996B2 (en) * 2019-09-18 2023-03-16 Topcon Corporation Surveying system and surveying method using eyewear device
JP7330507B2 (en) * 2019-12-13 2023-08-22 Agama-X Co., Ltd. Information processing device, program and method
KR20210137340A (en) * 2020-05-08 2021-11-17 Samsung Display Co., Ltd. Display device
AU2021349382B2 (en) * 2020-09-25 2023-06-29 Apple Inc. Methods for adjusting and/or controlling immersion associated with user interfaces
KR20220091160A 2020-12-23 2022-06-30 Samsung Electronics Co., Ltd. Augmented reality device and method for operating the same
WO2022146696A1 (en) * 2021-01-04 2022-07-07 Rovi Guides, Inc. Methods and systems for controlling media content presentation on a smart glasses display
US11747622B2 (en) * 2021-01-04 2023-09-05 Rovi Guides, Inc. Methods and systems for controlling media content presentation on a smart glasses display
US11906737B2 (en) * 2021-01-04 2024-02-20 Rovi Guides, Inc. Methods and systems for controlling media content presentation on a smart glasses display
US11796801B2 (en) * 2021-05-24 2023-10-24 Google Llc Reducing light leakage via external gaze detection
US11808945B2 (en) * 2021-09-07 2023-11-07 Meta Platforms Technologies, Llc Eye data and operation of head mounted device

Citations (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US5742263A (en) * 1995-12-18 1998-04-21 Telxon Corporation Head tracking system for a head mounted display system
WO2000052539A1 (en) * 1999-03-02 2000-09-08 Siemens Aktiengesellschaft Augmented reality system for situation-related support for interaction between a user and a technical device
EP1182541A2 (en) * 2000-08-22 2002-02-27 Siemens Aktiengesellschaft System and method for combined use of different display/apparatus types with system controlled context dependant information representation

Family Cites Families (8)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US6417969B1 (en) * 1988-07-01 2002-07-09 Deluca Michael Multiple viewer headset display apparatus and method with second person icon display
US5579026A (en) * 1993-05-14 1996-11-26 Olympus Optical Co., Ltd. Image display apparatus of head mounted type
US5903395A (en) * 1994-08-31 1999-05-11 I-O Display Systems Llc Personal visual display system
US5742264A (en) * 1995-01-24 1998-04-21 Matsushita Electric Industrial Co., Ltd. Head-mounted display
US5886822A (en) * 1996-10-08 1999-03-23 The Microoptical Corporation Image combining system for eyeglasses and face masks
US6094625A (en) * 1997-07-03 2000-07-25 Trimble Navigation Limited Augmented vision for survey work and machine control
US6166744A (en) * 1997-11-26 2000-12-26 Pathfinder Systems, Inc. System for combining virtual images with real-world scenes
US6097353A (en) * 1998-01-20 2000-08-01 University Of Washington Augmented retinal display with view tracking and data positioning

Cited By (15)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
EP2221654A1 (en) * 2009-02-19 2010-08-25 Thomson Licensing Head mounted display
US8963954B2 (en) 2010-06-30 2015-02-24 Nokia Corporation Methods, apparatuses and computer program products for providing a constant level of information in augmented reality
US8823740B1 (en) 2011-08-15 2014-09-02 Google Inc. Display system
US8670000B2 (en) 2011-09-12 2014-03-11 Google Inc. Optical display system and method with virtual image contrast control
CN104272371A (en) * 2012-04-08 2015-01-07 Samsung Electronics Co., Ltd. Transparent display apparatus and method thereof
WO2013154295A1 (en) * 2012-04-08 2013-10-17 Samsung Electronics Co., Ltd. Transparent display apparatus and method thereof
US9958957B2 (en) 2012-04-08 2018-05-01 Samsung Electronics Co., Ltd. Transparent display apparatus and method thereof
US10732729B2 (en) 2012-04-08 2020-08-04 Samsung Electronics Co., Ltd. Transparent display apparatus and method thereof
US10037084B2 (en) 2014-07-31 2018-07-31 Samsung Electronics Co., Ltd. Wearable glasses and method of providing content using the same
US10452152B2 (en) 2014-07-31 2019-10-22 Samsung Electronics Co., Ltd. Wearable glasses and method of providing content using the same
US10725556B2 (en) 2014-07-31 2020-07-28 Samsung Electronics Co., Ltd. Wearable glasses and method of providing content using the same
US11150738B2 (en) 2014-07-31 2021-10-19 Samsung Electronics Co., Ltd. Wearable glasses and method of providing content using the same
CN104305966A (en) * 2014-11-17 2015-01-28 Jiangsu Konsung Bio-Medical Science and Technology Co., Ltd. Method and device for setting the interface of a monitor
CN106066537A (en) * 2015-04-24 2016-11-02 松下电器(美国)知识产权公司 Head mounted display and the control method of head mounted display
US11004273B2 (en) 2016-03-29 2021-05-11 Sony Corporation Information processing device and information processing method

Also Published As

Publication number Publication date
US20020044152A1 (en) 2002-04-18
AU2002211698A1 (en) 2002-04-29
WO2002033688B1 (en) 2004-04-22
WO2002033688A3 (en) 2003-03-27

Similar Documents

Publication Title
US20020044152A1 (en) Dynamic integration of computer generated and real world images
CN111399734B (en) User interface camera effects
CN109739361B (en) Visibility improvement method based on eye tracking and electronic device
US20180018792A1 (en) Method and system for representing and interacting with augmented reality content
EP2887238B1 (en) Mobile terminal and method for controlling the same
US8643951B1 (en) Graphical menu and interaction therewith through a viewing window
ES2535364T3 (en) Eye control of computer equipment
JP4927631B2 (en) Display device, control method therefor, program, recording medium, and integrated circuit
CN104380237B (en) Reactive user interface for head-mounted display
CN110058759B (en) Display device and image display method
US20130176250A1 (en) Mobile terminal and control method thereof
US20040095311A1 (en) Body-centric virtual interactive apparatus and method
CN111448542B (en) Display application
US20160132189A1 (en) Method of controlling the display of images and electronic device adapted to the same
JP7005161B2 (en) Electronic devices and their control methods
US20210303107A1 (en) Devices, methods, and graphical user interfaces for gaze-based navigation
WO2019187487A1 (en) Information processing device, information processing method, and program
US20210117048A1 (en) Adaptive assistive technology techniques for computing devices
CN108369451B (en) Information processing apparatus, information processing method, and computer-readable storage medium
JP2017182247A (en) Information processing device, information processing method, and program
US10585485B1 (en) Controlling content zoom level based on user head movement
US20190155560A1 (en) Multi-display control apparatus and method thereof
US20220301264A1 (en) Devices, methods, and graphical user interfaces for maps
KR102312601B1 (en) Visibility improvement method based on eye tracking, machine-readable storage medium and electronic device
CN115004129A (en) Eye-based activation and tool selection system and method

Legal Events

Date Code Title Description
AK Designated states

Kind code of ref document: A2

Designated state(s): AE AG AL AM AT AU AZ BA BB BG BR BY BZ CA CH CN CO CR CU CZ DE DK DM DZ EC EE ES FI GB GD GE GH GM HR HU ID IL IN IS JP KE KG KP KR KZ LC LK LR LS LT LU LV MA MD MG MK MN MW MX MZ NO NZ PH PL PT RO RU SD SE SG SI SK SL TJ TM TR TT TZ UA UG US UZ VN YU ZA ZW

AL Designated countries for regional patents

Kind code of ref document: A2

Designated state(s): GH GM KE LS MW MZ SD SL SZ TZ UG ZW AM AZ BY KG KZ MD RU TJ TM AT BE CH CY DE DK ES FI FR GB GR IE IT LU MC NL PT SE TR BF BJ CF CG CI CM GA GN GQ GW ML MR NE SN TD TG

121 Ep: The EPO has been informed by WIPO that EP was designated in this application
REG Reference to national code

Ref country code: DE

Ref legal event code: 8642

32PN Ep: Public notification in the EP Bulletin as the address of the addressee cannot be established

Free format text: COMMUNICATION NOT DELIVERED; NOTING OF LOSS OF RIGHTS PURSUANT TO RULE 69(1) EPC (EPO FORM 1205A DATED 01.12.03)

B Later publication of amended claims

Effective date: 20030212

122 Ep: PCT application non-entry in European phase
NENP Non-entry into the national phase

Ref country code: JP