WO2008061372A1 - Digital signage system with wireless displays - Google Patents


Info

Publication number
WO2008061372A1
Authority
WO
WIPO (PCT)
Prior art keywords
player
site controller
players
frame
content
Application number
PCT/CA2007/002116
Other languages
French (fr)
Inventor
Marc Boscher
Original Assignee
Digicharm Communications Inc.
Application filed by Digicharm Communications Inc. filed Critical Digicharm Communications Inc.
Publication of WO2008061372A1 publication Critical patent/WO2008061372A1/en

Classifications

    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06F: ELECTRIC DIGITAL DATA PROCESSING
    • G06F 3/00: Input arrangements for transferring data to be processed into a form capable of being handled by the computer; output arrangements for transferring data from the processing unit to the output unit, e.g. interface arrangements
    • G06F 3/14: Digital output to display device; cooperation and interconnection of the display device with other functional units
    • G06F 3/147: Digital output to display device using display panels
    • H: ELECTRICITY
    • H04: ELECTRIC COMMUNICATION TECHNIQUE
    • H04N: PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N 21/00: Selective content distribution, e.g. interactive television or video on demand [VOD]
    • H04N 21/40: Client devices specifically adapted for the reception of or interaction with content, e.g. set-top box [STB]; operations thereof
    • H04N 21/41: Structure of client; structure of client peripherals
    • H04N 21/414: Specialised client platforms, e.g. receiver in car or embedded in a mobile appliance
    • H04N 21/41415: Specialised client platforms involving a public display, viewable by several users in a public space outside their home, e.g. movie theatre, information kiosk
    • H04N 21/47: End-user applications
    • H04N 21/488: Data services, e.g. news ticker
    • H04N 21/4882: Data services for displaying messages, e.g. warnings, reminders
    • H04N 21/4886: Data services for displaying a ticker, e.g. scrolling banner for news, stock exchange, weather data
    • H04N 21/80: Generation or processing of content or additional data by content creator independently of the distribution process; content per se
    • H04N 21/81: Monomedia components thereof
    • H04N 21/812: Monomedia components involving advertisement data
    • G09: EDUCATION; CRYPTOGRAPHY; DISPLAY; ADVERTISING; SEALS
    • G09G: ARRANGEMENTS OR CIRCUITS FOR CONTROL OF INDICATING DEVICES USING STATIC MEANS TO PRESENT VARIABLE INFORMATION
    • G09G 2380/00: Specific applications
    • G09G 2380/06: Remotely controlled electronic signs other than labels

Definitions

  • the present invention relates to devices for displaying graphical content and, in particular, to large scale Digital Signage systems.
  • Conventional Digital Signage (DS) systems include large, full-color off-the-shelf displays capable of displaying visual content received from a local server or the Internet.
  • Conventional Digital Signage displays support most industry-standard content formats, such as MPEG, QuickTime, Flash, web pages, PAL, NTSC, and HDTV, decoding and adapting the content at playback time, which requires additional processing power.
  • the high cost of these large high-quality displays requires significant initial deployment investment and may prevent large scale deployments.
  • An object of the present invention is to overcome the shortcomings of the prior art and provide a cost-effective scalable DS system having multiple wireless displays.
  • the present invention relates to a system for rendering and displaying a plurality of frame sequences
  • a core system comprising: a core instruction memory storing a core system instruction set for converting images into the plurality of frame sequences; a descriptor memory for storing one or more device descriptors; and a core system processing means for executing the core system instruction set and forming the plurality of frame sequences.
  • the system further comprises a plurality of RF transmitters, each comprising first communication means for communicating with the core system over a fiber, cable or wire link; and second communication means for wirelessly transmitting the plurality of frame sequences to a plurality of players; each player comprising: an RF receiver for receiving one of the plurality of frame sequences; a buffer for storing said frame sequence; and a display for displaying said frame sequence; wherein each player is associated with one of the one or more device descriptors; and each of the plurality of frame sequences received by said player has a format dependent on parameters of said device descriptor; so that said player does not perform any step of: resizing, padding, color conversion, and gamma-correction on said frame sequence.
  • Another aspect of the present invention relates to a system comprising memory storing design instructions for providing a dynamic image file including one or more placeholder fields for receiving values, query instructions for providing the values for the one or more placeholder fields, and a final converter instructions for encoding the dynamic image file and the values into a static image file containing a frame sequence having the format dependent on the parameters of the device descriptor, as well as processing means for executing these instructions.
  • Another aspect of the present invention relates to a system comprising a plurality of players, wherein each of the players comprises an IEEE 802.15.4 receiver, processing means consisting of an 8-bit microcontroller, and an OLED display with a resolution not higher than 200 x 100.
  • Figure 1 is a block scheme of the method of creating and displaying graphical content in accordance with the instant invention
  • Figure 2 is a schematic illustration of data transformations within one frame
  • Figure 3 is an illustration of a rendering process
  • Figure 4 is a block scheme of a system for creating and displaying a plurality of frame sequences in accordance with the instant invention
  • Figure 5 illustrates components of a core system instruction set
  • Figure 6 is a schematic representation of a distributed system in accordance with the instant invention.
  • FIG. 7 is a schematic representation of WLAN
  • FIG. 8 is a schematic representation of the gateway 320
  • FIG. 9 is a schematic diagram of communication protocols used in WLAN
  • Figure 10 is a block scheme of a method of displaying content at a player in accordance with one embodiment of the instant invention.
  • Table 1 is a representation of the rendering process.
  • One application of the instant invention is a retail DS system for presenting customers with product-related advertisement and price information, including at least 40 content players. Typically, it is a multi-store system having as many as 20,000 store shelf displays, or players, at a single retail location. This system will be used for illustration purposes throughout the instant application; however, the invention is not limited to such systems.
  • FIG. 1 presents method steps
  • FIG. 2 illustrates transformations of data within one frame.
  • In a content choosing step 110, graphical content for displaying at a DS player is chosen in an iterative process that includes selecting variable content and permanent content, and displaying a resulting image for evaluation.
  • the permanent content includes such assets as images, video, animation, and text, as well as content meant for another media, such as television or the Internet.
  • the permanent content, at least in part, is chosen from an asset store within the DS system or is imported.
  • the permanent graphic content, or a part of it, is created using any editor suitable for the particular system, for example by a professional graphic artist.
  • the variable content includes dynamic variables, such as a product price.
  • the dynamic variables can be associated with products, product categories, customers, or entire networks.
  • a data source for providing value(s) for a dynamic variable, for example a retail backend database, may be identified at this step.
  • a dynamic variable can be parameterized by specifying its data type, maximum length, or dimensions of the space available on display.
  • a resulting frame or frame sequence, combining the permanent content and the variable content, is then displayed to simulate, as closely as possible, the experience of a user viewing the DS player, wherein default values are displayed for dynamic variables that are not yet known.
  • a device descriptor associated with the DS player is used at this step so that the frame or frame sequence is created and optimized specifically for the particular player model; the process of creating the frame sequence ready for viewing is described in more detail later in reference to steps 120 and 140.
  • the device descriptor includes a set of parameters related to a target player, such as display resolution, color depth, physical size, storage capacity, color palette, and supported functionalities. In the instance of two players having different parameters, more than one device descriptor may be employed.
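The device descriptor can be pictured as a small record of target-player parameters. A sketch in Python; the field names and the example values are illustrative, not taken from the patent:

```python
from dataclasses import dataclass

@dataclass(frozen=True)
class DeviceDescriptor:
    """Parameters of a target player, used to pre-render content so the
    player itself never resizes, pads, or color-converts frame data."""
    model: str
    width: int                # display resolution in pixels
    height: int
    color_depth_bits: int     # bits per pixel (1 = black and white)
    physical_size_mm: tuple   # (width, height) of the panel
    storage_bytes: int        # player-side content storage capacity
    palette: tuple = ()       # indexed colors as (R, G, B) triples

# Example: a small black-and-white shelf display (hypothetical values).
shelf_display = DeviceDescriptor(
    model="shelf-bw", width=200, height=100, color_depth_bits=1,
    physical_size_mm=(120, 60), storage_bytes=512 * 1024,
    palette=((0, 0, 0), (255, 255, 255)),
)
```

The content pipeline would consult such a record when rendering, so every frame arrives at the player already in its native resolution, palette, and packing.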
  • the chosen assets are preferably adapted to the right color depth and resolution, wherein the conversion can be optimized using visual tools to provide the best output.
  • One or more project files result from the content choosing step 110.
  • the project files include the chosen assets, or references to them, the chosen dynamic variables, and, optionally, the device descriptor and additional effects and transitions.
  • some of the chosen assets within the project file are adapted using the device descriptor as described hereinbefore.
  • In a dynamic image file creating step 120, all the frame data that is already available, such as the permanent content and the additional effects and transitions, is rendered, or converted, into a format that can be played back without decoding by a player, also referred to herein as a player-specific format or raw data format.
  • the dynamic content is stored as placeholders with information on which dynamic variables to use and how to format them. Dependent on the parameters in the device descriptor, this partial formatting may significantly reduce content size by removing unused asset data.
  • the project file is transformed into a dynamic image file (DIG) including one or more placeholder fields for receiving values.
  • DIG dynamic image file
  • the dynamic image file (DIG format) is structured so as to contain various information blocks describing the content, its location within a frame, and its frame timing.
  • any frame data within a DIG file is player-model specific.
  • Raw frame data can be described as a sequence of bytes that can be sent directly from the player's storage unit to the video controller of the display without analysis or processing. More specifically, the data itself is: a set of color indexes representing pixels, a sequencing order for these indexes, and a way of packing this sequence of indexes into bytes, where the color indexes, sequencing order, and packing method are defined by and are specific to the display or its video controller.
  • a frame 220 illustrates adaptation of the image 200 using a player's device descriptor, which specifies, by way of example, a black and white M x N display.
  • the upper K rows of the frame 220 contain the image 200 converted to K x N pixels, and the bottom (M - K) rows are reserved for the value of the dynamic variable 215.
  • each black or white pixel is represented by a single bit; a first bit sequence 250₁ represents the first row in the frame 220, and the last bit sequence 250ₘ represents the last row in the frame 220.
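For the black-and-white frame above, packing a row eight pixels to a byte can be sketched as follows; the MSB-first bit order is an assumption, since the actual order is defined by the display's video controller:

```python
def pack_bw_row(pixels):
    """Pack a row of 0/1 pixel values into bytes, MSB first.
    Rows whose width is not a multiple of 8 are zero-padded."""
    out = bytearray()
    for i in range(0, len(pixels), 8):
        byte = 0
        for bit, p in enumerate(pixels[i:i + 8]):
            if p:
                byte |= 0x80 >> bit
        out.append(byte)
    return bytes(out)

# A 16-pixel row: 8 white (1) pixels followed by 8 black (0) pixels.
row = [1] * 8 + [0] * 8
assert pack_bw_row(row) == b"\xff\x00"
```

An M x N black-and-white frame would thus occupy M such bit sequences of ceil(N / 8) bytes each.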
  • the placeholder 240 contains information related to the dynamic variable 215.
  • the rendering process consists of 4 steps, each step working with one layer of the content.
  • a Screen Layer is a parent layer upon which other layers are drawn; it includes the background area within the bounds of the actual display. Any part of a layer that is outside the bounds of the screen layer will not be visible. The actual size of this layer in pixels is determined by the target display.
  • the screen layer is in screen space, which has coordinates (0,0) to (device resolution width - 1, device resolution height - 1).
  • An Image Layer contains the asset/source image/animation/video drawn onto the screen layer using an affine transformation defined by a view. Varying this view changes the position of the image on screen and can create animations.
  • the view is automatically generated at creation such that the image is scaled to fit completely within the screen space. This means the image fills the screen horizontally and/or vertically, but always in at least one dimension; empty borders may remain along the other dimension.
  • the image layer is in image space which has coordinates (0,0) to (image width - 1, image height - 1).
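The automatically generated fit-to-screen view amounts to scaling by the smaller of the two axis ratios, so the image fills at least one screen dimension; centering the resulting borders is an assumption, not something the patent states:

```python
def fit_view(img_w, img_h, scr_w, scr_h):
    """Return (scaled_w, scaled_h, min_x, min_y): the image scaled to fit
    entirely inside the screen, filling at least one dimension, centered."""
    scale = min(scr_w / img_w, scr_h / img_h)   # smaller ratio wins
    w, h = round(img_w * scale), round(img_h * scale)
    return w, h, (scr_w - w) // 2, (scr_h - h) // 2

# A 400x300 image on a 200x100 screen fills the screen vertically,
# leaving empty borders on the left and right.
assert fit_view(400, 300, 200, 100) == (133, 100, 33, 0)
```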
  • a Text Layer is a sequence of text labels drawn on top of the image layer, but relative to the screen layer, using an affine transformation defined by a view. Varying this view changes the position of the text on screen and can create animations. Note that the (0,0) coordinate of a text label is the left point on the text's baseline (i.e. the bottom left corner of the bounding rectangle).
  • a View layer gives the set of parameters that specify the scaling and translation transformations that must be applied to a source, for example one chosen from the asset store 205, when placed in a target space. Scaling is applied to the source's dimensions directly, but the translation specified by the minX and minY properties is relative to the target space. In other words, the result of applying a View is typically to scale the source first and then place it at point (minX, minY) in the target space.
  • for example, for a View specifying a scaled size of 80 x 60 and (minX, minY) = (10, 20), the result will be a scaled source image of size 80 x 60 positioned at (10, 20) in target space.
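The scale-then-translate behavior of a View can be sketched as a coordinate mapping. The 160 x 120 source size here is an assumed value, chosen only so that the scaled result matches the 80 x 60 example above:

```python
def apply_view(x, y, src_w, src_h, view_w, view_h, min_x, min_y):
    """Map a source-space pixel (x, y) into target space:
    scale the source to (view_w, view_h), then place it at (min_x, min_y)."""
    tx = min_x + x * view_w // src_w
    ty = min_y + y * view_h // src_h
    return tx, ty

# A 160x120 source scaled to 80x60 and placed at (10, 20):
assert apply_view(0, 0, 160, 120, 80, 60, 10, 20) == (10, 20)      # top-left
assert apply_view(159, 119, 160, 120, 80, 60, 10, 20) == (89, 79)  # bottom-right
```

The mapped region spans (10, 20) through (89, 79), i.e. exactly 80 x 60 target pixels.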
  • FIG. 3 provides an example of Text Labels drawn into screen space.
  • Table 1 lists the steps of the rendering pipeline.
  • the creation of a dynamic image file from a project file includes color reduction of all project assets to a player-specific color palette, for example in the RGB or RGBa color space, and encoding individual frames or frame portions to the raw data format compatible with the video controller of the target display.
  • assets in the form of text, images or video are adapted from their original format and color depth to that of the player, using a standard color reduction technique, such as a nearest-neighbor algorithm or an error diffusion algorithm with a Floyd-Steinberg filter disclosed, for example, in U.S. Patent Nos. 6,844,882 issued Jan. 18, 2005 to Clauson, 7,171,045 issued Jan. 30, 2007 to Hamilton, and 6,201,612 issued Mar. 13, 2001 to Matsushiro et al., and a player-dependent palette, by way of example a palette with 16 shades of one color or an 8-bit grayscale palette.
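Of the cited techniques, nearest-neighbor reduction is the simplest: each source pixel is mapped to the closest palette entry. A minimal sketch (squared Euclidean distance in RGB space; error diffusion is omitted):

```python
def nearest_index(rgb, palette):
    """Return the index of the palette color closest to rgb,
    by squared Euclidean distance in RGB space."""
    r, g, b = rgb
    return min(range(len(palette)),
               key=lambda i: (palette[i][0] - r) ** 2
                           + (palette[i][1] - g) ** 2
                           + (palette[i][2] - b) ** 2)

# A 4-entry grayscale palette: black, dark gray, light gray, white.
palette = [(0, 0, 0), (85, 85, 85), (170, 170, 170), (255, 255, 255)]
assert nearest_index((10, 10, 10), palette) == 0
assert nearest_index((200, 190, 180), palette) == 2
```

An error-diffusion variant would additionally propagate the quantization error of each pixel to its unprocessed neighbors, as in the Floyd-Steinberg filter cited above.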
  • the palette is indexed by attaching an index to a color, wherein the index is typically a 1-, 2-, 4- or 8-bit value.
  • the player is configured to exhibit a gamma curve as smooth as possible by plotting electrical characteristics of the display for each color index and adjusting the palette so that the difference in apparent intensity as perceived by the human eye between any two consecutive color indices is constant. Then each color of the palette is converted into the RGB metric.
  • this technique eliminates the need for gamma correction on the player
  • each pixel has an RGB value available in the palette of the target player.
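The palette tuning described above can be sketched numerically: spacing the levels evenly in gamma-encoded space gives roughly constant apparent-intensity steps, and each level is then expressed in the RGB metric. This is a simplified illustration; the gamma value 2.2 and the power-law display model are assumptions, whereas the patent derives the palette from measured electrical characteristics of the display:

```python
def perceptual_palette(levels, gamma=2.2):
    """For `levels` gray steps of (approximately) equal apparent intensity,
    return (rgb_value, relative_luminance) per index. Equal steps in
    gamma-encoded space are roughly equal steps to the human eye."""
    out = []
    for i in range(levels):
        v = i / (levels - 1)      # gamma-encoded (perceptual) level
        rgb = round(255 * v)      # palette entry in the RGB metric
        lum = v ** gamma          # luminance the display must emit
        out.append((rgb, round(lum, 4)))
    return out

pal = perceptual_palette(16)      # e.g. a 16-shade palette
assert pal[0] == (0, 0.0)
assert pal[15] == (255, 1.0)
```

Because the palette already encodes the display's response, the player can map indexes straight to drive levels with no gamma correction of its own.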
  • Rendering a frame converts the image data to the raw data format by converting RGB values into color indexes, followed by a byte-packing algorithm that aligns the color indexes within a byte stream. The result is a raw byte sequence for each frame that can be streamed directly to the display's video controller.
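For palettes deeper than black and white, the byte-packing step aligns multi-bit color indexes within the byte stream. A sketch for 4-bit indexes, two pixels per byte; the high-nibble-first order is an assumption, since the real order is fixed by the target video controller:

```python
def pack_indexes_4bit(indexes):
    """Pack 4-bit color indexes into bytes, high nibble first;
    an odd trailing index is padded with a zero nibble."""
    out = bytearray()
    for i in range(0, len(indexes), 2):
        hi = indexes[i] & 0x0F
        lo = indexes[i + 1] & 0x0F if i + 1 < len(indexes) else 0
        out.append((hi << 4) | lo)
    return bytes(out)

# Four pixels with palette indexes 1, 2, 15, 0 pack into two bytes.
assert pack_indexes_4bit([0x1, 0x2, 0xF, 0x0]) == b"\x12\xf0"
```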
  • the same algorithm of encoding assets into DIG format is used in the content choosing step 110 for simulating the experience of a user viewing the DS player.
  • values of dynamic variables to fill the placeholder fields in the dynamic image file are obtained, preferably by querying a database 245, which can be a part of the DS system or an external database such as a retail backend database.
  • the current values of dynamic variables are provided by a central distribution system or by a local backend system such as a store back office.
  • the values are provided manually.
  • the DS system synchronizes the displayed values with the database so as to change the displayed content as some of the dynamic variables change their values stored in the database.
  • In a static image file creating step 140, the dynamic image file and the values obtained in the query step 130 are combined into a static image file (SDIG) containing a frame sequence having the player-specific format dependent on the parameters of the device descriptor, so that the resulting static file can be played back by the player without decoding, as will be discussed further in this specification.
  • SDIG static image file
  • All placeholders are substituted with the values obtained in the query step 130 converted to pixel data, as described above in reference to the dynamic file creation step 120, and merged with pre-formatted frames. This process occurs every time a dynamic DIG file changes or every time a value of a dynamic variable changes. In the instance of a dynamic variable used by several DIG files, following the change of this variable all the files are updated and sent to the corresponding players.
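The placeholder substitution can be pictured as splicing freshly rendered value rows into the pre-formatted frame. This sketch assumes a simplified layout, a list of per-row packed byte strings plus a placeholder row range, which is illustrative rather than the patent's actual DIG/SDIG format:

```python
def fill_placeholder(frame_rows, start_row, value_rows):
    """Return static frame rows: the pre-formatted rows with the
    placeholder region [start_row, start_row + len(value_rows)) replaced
    by the rendered value rows. All rows are raw packed bytes."""
    rows = list(frame_rows)
    rows[start_row:start_row + len(value_rows)] = value_rows
    return rows

# A frame of 4 rows; the bottom 2 rows are the price placeholder.
frame = [b"\xff\x00", b"\x0f\xf0", b"\x00\x00", b"\x00\x00"]
static = fill_placeholder(frame, 2, [b"\xaa\xaa", b"\x55\x55"])
assert static == [b"\xff\x00", b"\x0f\xf0", b"\xaa\xaa", b"\x55\x55"]
```

Whenever a dynamic variable changes, only this splice and the retransmission need to be repeated; the pre-formatted rows are reused unchanged.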
  • the static image file contains frame data in a device-specific way, in a format dependent on parameters of the device descriptor, and does not require any decoding at the player. From the very beginning, content is created for a target player by a content creation tool described later.
  • the frame data is a sequence of bytes which can be sent directly from the player's memory to the video controller of the display without analysis or processing, such as decompression, resizing, padding, color conversion, gamma- correction, etc.
  • the data itself is a set of color indexes representing pixels, sequenced for being sent directly from a buffer to the video controller of the display, and packed into bytes, wherein the color indexes, sequencing order, and packing method are defined by and are specific to the player.
  • the static image file 260 includes bit sequences 250₁ - 250ₖ copied from the dynamic file 230, and bit sequences 250ₖ₊₁ - 250ₘ containing the value 247 of the dynamic variable 215 in pixel form.
  • a step of defining distribution parameters 150 defines the playback schedule, target products, geographical coverage, and other parameters.
  • the defining distribution parameters step 150 is optional: in the instance of a DS system providing the same frame sequence to all the DS players having the same device descriptor, the step of defining distribution parameters 150 is not necessary. Choice of the DS distribution parameters denoted by step 150 can be performed at any time within the timeframe of the method until the moment these parameters are used in step 160.
  • the distribution step 160 is for distributing content and schedules over the Internet or any IP-based network to multiple individual site controllers, each managing multiple players at a single location, such as a store.
  • the site controllers will be discussed in more detail further in this specification.
  • the content is distributed in the form of dynamic image files including one or more placeholder fields for receiving values, and each site controller provides the values independently; relative to FIG. 1, the distribution step 160 is performed after the step of creating a dynamic file 120 and before the query step 130.
  • the content is distributed in the form of static image files wherein all the placeholders are already substituted with queried values, so that the distribution step 160 is performed after the creating static file step 140. In a very small configuration of the system including only one site controller performing the steps of content creation 110-140, the distribution step 160 is not performed.
  • the values of dynamic variables can also be transmitted as part of the distribution process if these values are imported, received, or synchronized by the distribution system.
  • the transmission of content and dynamic values is usually independent since they change separately and asynchronously.
  • a content delivery step 170 includes transmitting static image files over a wireless LAN (WLAN) from the site controllers to multiple players.
  • the static image files are compressed before the transmission.
  • the content is loaded in accordance with the campaign schedule defined at the step 150 or after the static image file has been updated. Content delivery, or load, is initiated by the site controller.
  • Upon receiving a static image file containing a frame sequence over the WLAN, a player stores this content in local, non-volatile memory, in a content storing step 180. This process can include decompression, which occurs only once per transmission and is performed after the transmission has completed.
  • a microprocessor within the player reads its local non-volatile memory and sends the frame data directly to the display's video controller. No processing is performed on the frame data; in particular, no steps of decompression, resizing, padding, color conversion, or gamma-correction are performed on the data.
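The playback loop just described reduces to a byte-for-byte copy from storage to the video controller at the scheduled frame rate. A sketch; the controller interface (a write callable) and the frame delay are hypothetical:

```python
import time

def play(frames, controller_write, frame_delay_s=0.5):
    """Stream pre-rendered frames to a display's video controller.
    Each frame is already raw packed bytes, so it is sent as-is:
    no decompression, resizing, padding, color conversion, or
    gamma correction happens on the player side."""
    for frame in frames:
        controller_write(frame)   # byte-for-byte copy to the display
        time.sleep(frame_delay_s)

# Simulated video controller that simply records what it was sent.
sent = []
play([b"\xff\x00", b"\x00\xff"], sent.append, frame_delay_s=0)
assert sent == [b"\xff\x00", b"\x00\xff"]
```

This is why the patent's players can run on an 8-bit microcontroller: all heavy rendering happens upstream in the core system.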
  • FIG. 2 shows image 270 displayed by the player.
  • a system for creating and displaying a plurality of frame sequences shown in FIG. 4 includes a core system 300, a plurality of RF transmitters, and a plurality of players, wherein only one RF transmitter 320 and one player 301 are shown.
  • the core system 300 creates graphical content in the form of frame sequences and wirelessly loads it into the players 301 for displaying.
  • the core system 300 includes memory 305 having at least the following parts: a core instruction memory 306 storing a core system instruction set for converting images into the plurality of frame sequences, the asset store 205, and a descriptor memory 307 for storing one or more device descriptors associated with the players 301, so that each player 301 is associated with one device descriptor.
  • the examples of memory components 305 and 307 include random access memory (RAM), non-volatile memory such as a read-only memory (ROM), flash memory, Electrically Programmable Read-Only Memory (EPROM), Electrically Erasable Programmable Read Only Memory (EEPROM), etc., and a disk storage device.
  • a disk storage device can include any type of magnetic or optical storage device, such as a hard disk drive, a recordable and/or rewriteable compact disc (CD), a DVD, a DVD+RW, and the like.
  • the core system 300 also includes a core system processing means 310 such as one or more processors, microprocessors, controllers, and the like, for executing the core system instruction set and forming the plurality of frame sequences.
  • a core system processing means 310 such as one or more processors, microprocessors, controllers, and the like, for executing the core system instruction set and forming the plurality of frame sequences.
  • the core system instruction set 400 stored in the core instruction memory 306 includes an image converter 401 for converting an image into a part of a frame for displaying on a screen of the player 301.
  • the image converter 401 executed by the processing means 310 is employed in the dynamic image file creating step 120 and the static image file creating step 140.
  • the image converter 401 converts the upper part of the screen 210 during the dynamic image file creating step 120, and the bottom part of the screen 220 during the static image file creating step 140.
  • the image converter 401 includes: first instructions 410 for providing frame pixels based on the image, wherein the number of frame pixels is equal to the number of screen pixels in the corresponding part of the display screen; and second instructions 420, implementing one of the color reduction algorithms discussed in reference to step 120, for reducing the number of colors of the image so as not to exceed the color resolution of the player 301 and for color-indexing the image pixels; wherein the number of screen pixels, the number of colors, and the palette of the player 301 are obtained from the device descriptor associated with the player 301, so that the player 301 does not perform any step of: decompression, resizing, padding, color conversion, and gamma-correction when displaying the frame sequence.
  • the core system instruction set 400 includes design instructions 430 for providing a dynamic image file including one or more placeholder fields for receiving values, executed at the creating dynamic image file step 120; query instructions 440 for providing the values for the placeholder fields, executed at the query step 130; and a final converter 450 for encoding the dynamic image file and the values into a static image file containing a frame sequence having the format dependent on the parameters of the device descriptor, executed at the creating static image file step 140.
  • the aforedescribed converter instructions 401 are called by the design instructions 430 and the final converter 450.
  • the core system instruction set 400 includes campaign manager 460 for defining DS distribution parameters, and supporting scheduled delivery of frame sequences to the players.
  • the method illustrated by FIG. 1 is simplified to disallow the use of dynamic variables, thereby excluding the creating dynamic file step 120 and the query step 130.
  • the core instruction set 400 does not include the design instructions 430 and the query instructions 440.
  • the core system 300 can reside on one computer or be a distributed system, wherein parts of the system 300 communicate over a network.
  • the core system 300 is a distributed system 300₁ consisting of an animator 460, a distribution portal 470, and one or more site controllers 480 spatially separated from the rest of the system 300₁.
  • the animator 460 is a content creation tool for micro-advertisements and micro-promotions, designed as a creative tool for non-professionals and as an adaptation and conversion tool for content creation professionals.
  • the animator 460 is for executing at least a part of the design instructions to perform the content choosing step 110 and, optionally, steps 120-150 of creating the dynamic file, querying at least some of the dynamic variables, creating the static image file if all the dynamic variables are found, and defining the distribution parameters.
  • the animator 460 is the MicroAnimator software implemented by DigiCharm Inc.
  • the distribution portal 470 is for performing the distribution step 160 and providing centralized management of all players 301 on all sites, regrouped by customer, region, store, product, etc., as well as advertisement and promotion campaign management, including scheduling and regional targeting.
  • the distribution portal 470 includes the database 245.
  • the distribution portal 470 may be an external application integrated into the architecture shown in FIG. 6 by interfacing with the animator 460 and the site controllers 480. Most existing content distribution systems can be used, for example one such as disclosed in U.S. Patent No. 6,785,704.
  • the animator 460 and the distribution portal 470 form together a design station, either distributed or residing in one location.
  • the design station has the design instructions 430 stored in a design station memory and a design station interface for providing the dynamic image file or the static image file over a network to the site controllers 480, such as an Ethernet card or chip, a wireless transceiver, or any other interface.
  • the query instructions and the final converter can be stored at the design station or at the site controller.
  • Each site controller 480 is an embedded PC or small server running a software application that manages an entire location, such as a store.
  • the site controller 480 interfaces with the distribution system 470 to receive and store content and schedules for a single site.
  • the site controller 480 has first communication means for communicating with the design station over the network, including an interface, such as an Ethernet card or chip, a wireless transceiver, or any other interface, and software such as a TCP/IP stack.
  • the site controller 480 also has second communication means for communicating with at least one of the RF transmitters, located separately from the site controller, over a fiber, cable, wire link, or the like, which is preferably a USB interface and can be an Ethernet or any other interface.
  • different parts of the two communication means are software instructions providing different addressing and different treatment of received data.
  • the site controller 480 is responsible for finalizing the static image file, if it has not been done yet by other components of the system, and, for this purpose, receives the dynamic image file and the remaining dynamic variables from the distribution system 470.
  • the dynamic variables are queried from the database 245, or from a local backend system such as the store back office, and/or the site controller 480 provides an interface for manual input of values for the dynamic variables. After values for all the dynamic variables are provided, the site controller converts the dynamic image file into the static image file.
  • the site controller 480 distributes the static image file(s) to the players 301 based on the schedules received from the distribution system 470, manages and monitors the players 301 within the location.
  • For converting a plurality of frame sequences into a player-specific format, the site controller has the design instructions, the query instructions, and the final converter instructions stored in the memory of the site controller and processing means for executing these instructions, as well as a device descriptor memory for storing one or more device descriptors, so that each player is associated with one device descriptor; and each frame sequence received by the player 301 has a format dependent on parameters of the device descriptor associated with the player 301, so that the player 301 does not perform any step of: decompression, resizing, padding, color conversion, and gamma-correction on said frame sequence.
  • the site controller 480 has at least one port for communicating with the animator 460 and/or the distribution portal 470 over a network, and at least one port for communicating with at least one of the RF transmitters, located separately from the site controller, over the fiber, cable, or wire link, by way of example a USB port.
  • the site controller 480 is connected to a conventional Digital Signage infrastructure and provides distribution of the content and schedules supplied by the infrastructure to the players 301 and reports playback statistics and errors to the central system.
  • the site controller 480 includes an RF transmitter for communication with the players 301.
  • the site controller 480 controls a wireless local access network (WLAN) 490 formed by the plurality of RF transmitters 320 for providing a wireless connection between the site controller 480 and the players 301.
  • the WLAN 490 has a two-layer star topology shown in FIG. 7, wherein one or more RF transmitters 320, also referred to as gateways 320, are connected to the site controller 480. Since each wireless hop has a high impact on total throughput, the WLAN 490 has wireless hops 485 only between the gateway 320 and the players 301.
  • Each gateway 320 is a combination of hardware and firmware supporting and bridging two communication protocols.
  • the gateway 320 includes two types of communication means: a first interface 710 for communicating with the site controller 480 which is a part of the core system 300, over a fiber, cable or wire link 495; and a second interface 720 for wirelessly transmitting the plurality of frame sequences to the players 301.
  • the first interface 710 can be a USB interface, or an Ethernet card or chip, or the like, and the second interface 720 is a wireless transceiver.
  • Processing means 730 perform protocol translation between the wireline and wireless connections.
  • the WLAN 490 supports the IEEE 802.15.4 wireless standard and is segmented into sub-networks or zones, each having a radius of approximately 10 meters and a number of players ranging between 20 and 1000, dependent on the geometry of the location and the size and renewal rate of the content, wherein the gateway 320 is disposed at the center of the zone.
  • the number of players in each zone should be balanced across the zones.
  • the gateway 320 supports the IEEE 802.15.4 wireless standard and transmits using one or more channels selected out of the 16 channels. In order to reduce interference, the channels are selected so as to avoid having two adjacent or overlapping zones communicating on the same channel.
  • all the gateways 320 support the IEEE 802.15.4 standard for communication with the plurality of players.
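The channel-selection rule above can be sketched as a greedy assignment over the 16 IEEE 802.15.4 channels, so that no two adjacent or overlapping zones share a channel. This is an illustrative sketch only; the zone identifiers, the adjacency input, and the greedy strategy are assumptions, not details disclosed in the specification.

```python
# Greedy channel assignment: each zone takes the lowest channel not used
# by any already-assigned adjacent zone. IEEE 802.15.4 defines 16 channels
# (11-26) in the 2.4 GHz band.

CHANNELS = list(range(11, 27))  # the 16 channels of IEEE 802.15.4

def assign_channels(zones, adjacency):
    """zones: list of zone ids; adjacency: dict zone -> set of adjacent zones."""
    assignment = {}
    for zone in zones:
        used = {assignment[n] for n in adjacency.get(zone, set()) if n in assignment}
        for ch in CHANNELS:
            if ch not in used:
                assignment[zone] = ch
                break
        else:
            raise RuntimeError("more than 16 mutually adjacent zones")
    return assignment
```

With more than 16 mutually adjacent zones no conflict-free assignment exists on a single band, which is one reason the specification balances players across zones of limited radius.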
  • the site controller 480 includes means for dynamically managing sub-networks formed of the plurality of players in proximity to each of the RF transmitters connected to the site controller 480, implemented as network management software.
  • the site controller 480 manages all the sub-networks, including discovery of devices in the network, maintains adjacency lists specifying communication distances between devices on the network, and dynamically forms sub-networks.
  • the site controller 480 monitors the local WLAN 490, collects playback statistics for the location, and provides the playback statistics and monitoring information including any detected errors to the distribution system 470.
  • the site controller 480 provides load balancing, distributing the content among the gateways 320 to achieve on the wireless links 485 as high as possible a "link quality" as defined by the IEEE 802.15.4 standard, so that the WLAN 490 is a network having the site controller 480, a plurality of gateways 320 each connected over the fiber, cable, or wire link 495 to the site controller 480, and a plurality of displays 301, each wirelessly connected using the IEEE 802.15.4 protocol to one of the gateways 320, wherein at least one of the displays 301 is a dual-homed display 301-1 so that it can receive wireless signals from two of the gateways 320-1 and 320-2, and wherein the site controller performs load balancing by sending messages to the dual-homed display 301-1 via the less busy of the two gateways 320-1 and 320-2.
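The load-balancing decision for a dual-homed display can be sketched as follows; using queued byte counts as the "busyness" metric is an assumption, since the specification only requires routing via the less busy of the two gateways.

```python
# Minimal sketch: of the gateways that can reach a dual-homed display,
# route the next message via the one currently carrying the fewest queued
# bytes on its wired link. The gateway ids are illustrative.

def pick_gateway(gateways, queued_bytes):
    """gateways: gateway ids reaching a dual-homed display;
    queued_bytes: dict gateway id -> bytes currently queued on its link."""
    return min(gateways, key=lambda g: queued_bytes.get(g, 0))
```

On a tie, `min` keeps the first listed gateway, so the choice is deterministic for a fixed gateway ordering.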
  • the site controller 480 provides authentication of players using an Access Control List (ACL), message integrity, and sequential freshness to avoid replay attacks.
  • the site controller 480 implements a protocol stack 610 shown in FIG. 9.
  • the link 495 between the site controller 480 and the gateway 320 is governed by Universal Serial Bus (USB) standard.
  • the link 495 is governed by RS-232 or RS-485 standards developed by the Electronic Industries Association, or by TCP/IP.
  • the link 495 supports Ethernet protocol or any other physical and link layer protocols satisfying throughput requirements of the system.
  • the protocol stack 610 employs a distributed network protocol (DNP) for providing end-to-end packet transmission from the site controller 480 to the players 301.
  • a distributed network control protocol (DNCP) provides management functionality for the network itself, including network formation, monitoring, error management and channel allocation. According to the instant invention, the DNCP is used for managing both the gateway 320 and the player 301.
  • the DNP and DNCP transmit messages between the Site Controller 480 and MicroPlayers 301, enabling the Site Controller 480 to discover Gateways 320 as they become connected or disconnected, and to query the gateway 320-1 for a list of connected MicroPlayers and for a list of neighboring Gateways within communication range of the gateway 320-1, thus providing the site controller 480 with the means for dynamically managing sub-networks. Gateways within communication distance of one another should use different channels to maximize bandwidth usage.
  • the DNP and DNCP provide a reliable communication channel with authentication, message integrity, sequential freshness, and access control enabling the players 301 to communicate only with the gateway 320.
  • the Simple Network Management Protocol (SNMP) is used as a DNCP.
  • the TCP/IP stack is used as a DNP.
  • the protocol stack 610 includes a distributed application protocol (DAP) supporting management of the players 301 and transmission of content thereto, independently of the underlying communication technology.
  • the DAP is used by the site controller 480 to communicate with the players 301 to transport content and commands to the players 301.
  • Each of the players 301 has its own DIG Address, and a particular player 301-1 picks up only messages that are meant for its specific address, allowing targeted messaging on shared communication channels.
  • Each of the players 301 may have different configuration parameters and support different operations.
  • the protocol uses a general block type to handle setting any name-value pairs.
  • Each name-value pair is called a property and has a unique key.
  • the player 301 has a list of property keys it supports and would receive Set Property operations for those keys only.
  • Each DAP operation is encapsulated in a block and all blocks have common structural elements, including a special type code identifying the operation.
  • the DAP blocks are built at the site controller 480 and are transmitted to the player 301 in messages of an underlying protocol, such as TCP or UDP over IP.
  • Each operation block includes an operation identifier; an address identifying the block's destination; a data size, which is the number of bytes in the data field of the block; and data having a structure defined by the operation type.
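The block layout just described can be sketched as follows. The field widths and byte order (a 1-byte operation code, a 16-bit address, and a 16-bit data size, big-endian) and the operation type codes are assumptions made for illustration; the specification does not fix them.

```python
import struct

# Encode/decode a DAP-style operation block:
# [op: 1 byte][address: 2 bytes][data size: 2 bytes][data: size bytes]

OP_PING = 0x01          # hypothetical type codes
OP_LOAD_CONTENT = 0x02

def encode_block(op, address, data=b""):
    return struct.pack(">BHH", op, address, len(data)) + data

def decode_block(raw):
    op, address, size = struct.unpack(">BHH", raw[:5])
    return op, address, raw[5:5 + size]
```

An operation with no data, such as Ping, Erase Content, or Reset Device, encodes to the 5-byte header alone with a data size of 0.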
  • the DAP supports the following operations:
  • Ping operation is used to determine the presence of the player 301 and to initialize the connection with it; the meaning of initializing the connection depends on the communication medium being used.
  • the block has no data and a data size of 0.
  • Load DIG Content operation sends a piece of playable content, i.e. a portion of the frame sequence, to the player 301 in the form of a DIG frame sequence for storing in the buffer 340.
  • the data type is raw binary data and the data size is the size of the frame sequence.
  • the player 301 loads the received portion of the frame sequence into the buffer 340.
  • Erase Content operation causes the player 301 to remove all content from the buffer 340.
  • the block has no data and a data size of 0.
  • Reset Device operation causes the player 301 to reset.
  • the block has no data and a data size of 0.
  • Set Property operation advises the player 301 to set a new value for a property specified by its key.
  • the data portion of the block consists of two fields: a Property Key field of a fixed length, and a Value field having length and type dependent on the Property Key, wherein the Property Key represents the property, and the Value provides a new value for the property.
  • a property can be a contrast parameter of the display 370.
  • the DAP supports 1-way mode of operation so that the player 301 can receive data, but cannot send anything back as a response.
  • the player 301 may display different visual indications to the user. For example, the player 301 provides a visual cue that it was pinged.
  • the DAP supports a 2-way mode of operation, wherein the player 301 is capable of sending response messages indicating success or failure of a requested operation.
  • the gateway 320 implements a protocol stack 620, including USB or another standard for communication over the link 495, and IEEE 802.15.4 standard for wireless communication between the gateway 320 and players 301.
  • the players 301 implement a protocol stack 630 including the IEEE 802.15.4 standard.
  • the player 301 includes an RF receiver 330 for receiving one of the frame sequences; a buffer 340, also referred to as player memory 340, for storing the received frame sequence; a display 370 and a video controller 371 associated with the display 370; and processing means 360.
  • the RF receiver 330 is a single IEEE 802.15.4 transceiver, for example, MC13202FC commercially available from Freescale Semiconductor, Inc.
  • the buffer 340 may be flash memory, EPROM, or EEPROM.
  • the processing means 360 can be a general purpose processor, one or more microcontrollers, one or more FPGAs, or a combination thereof.
  • the processing means 360 is a single 8-bit microcontroller, also available from Freescale Semiconductor, Inc., and the player 301 has no other processing means besides the 8-bit microcontroller 360, the display microcontroller 371, and a transceiver controller.
  • the display 370 is a small Organic Light Emitting Diode (OLED) display or a Liquid Crystal Display (LCD), such as available from OSRAM Opto Semiconductors.
  • the display 370 is an OLED display having a 128 x 64 resolution and a 16-level gray scale palette.
  • the players 301 of the instant invention have small screens, having a diagonal size of less than 4 inches, a screen resolution, denoted as M x N in FIG. 2, not higher than 320 x 240, and a color depth of preferably 4 bits or less. By way of example, it takes less than half an hour to load 1 MB of media content to 10,000 players divided into 25 zones.
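As a rough plausibility check of the half-hour figure, the per-zone load time can be estimated under stated assumptions: all zones load in parallel, the 1 MB frame sequence is broadcast once per zone, the raw IEEE 802.15.4 data rate is 250 kbit/s, and effective throughput after protocol overhead is taken as 25% of the raw rate. Apart from the 250 kbit/s raw rate of the standard, these figures are assumptions, not values from the specification.

```python
# Back-of-the-envelope load-time estimate for one zone (zones run in
# parallel, so this is also the total time under the assumptions above).

CONTENT_BITS = 1 * 1024 * 1024 * 8   # 1 MB of media content, in bits
RAW_RATE = 250_000                   # IEEE 802.15.4 raw rate, bits/s (2.4 GHz)
EFFICIENCY = 0.25                    # assumed effective fraction of raw rate

seconds_per_zone = CONTENT_BITS / (RAW_RATE * EFFICIENCY)
minutes = seconds_per_zone / 60
```

Under these assumptions the load completes in a few minutes, comfortably inside the half-hour bound stated above.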
  • upon receiving a message containing a portion of the Load DIG Content block, the firmware running on the microcontroller writes the received portion of the frame sequence directly and linearly into the buffer 340 without any data manipulation, augmenting the previously written part of the frame sequence.
  • playback, the process of displaying a received frame sequence at the player 301, begins automatically if no DAP message is received within a predetermined short delay, such as 2 seconds.
  • the playback includes the following steps: setting the video controller 371 to the origin point of a display; positioning the player buffer 340 to the beginning of the frame sequence, wherein the firmware specifies the memory address of the beginning of the frame sequence in the player buffer; reading the frame sequence data one byte after another from the buffer 340, and sending each byte to the display 370.
  • the display 370 has an 8-bit communication bus; otherwise, the frame data is read in portions of a different size dependent on the width of the display communication bus.
  • the playback substantially consists of the above steps, meaning that only minor, less important steps are omitted, such as waiting for read or write operations to complete.
  • the playback as specified is effected by the facts (A) that, in response to the 'load' operation, the raw data of the frame sequence is written linearly into the buffer memory 340 wherein two consecutively received data portions are written in sequential parts of buffer 340 contrary to a conventional file system technique of decomposing data into blocks and storing these blocks of data in multiple locations not necessarily adjacent to each other, and (B) that each frame in the frame sequence received by the player 301 has the format dependent on parameters of the device descriptor associated with the player 301, in particular, each frame contains the same number of pixels as the display 370. Accordingly, the player 301 does not perform any step of: decompression, resizing, padding, color conversion, and gamma-correction on the received frame sequence.
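The playback steps above can be sketched as follows; the `VideoController` stand-in is hypothetical, and the point of the sketch is that bytes flow linearly from the buffer to the display with no decompression, resizing, padding, color conversion, or gamma-correction in between.

```python
# Playback sketch: set the video controller to the origin, position the
# buffer at the start of the frame sequence, then stream the raw bytes
# one after another straight to the display.

class VideoController:
    def __init__(self):
        self.received = bytearray()

    def set_origin(self):
        self.received.clear()        # cursor back to the display origin

    def write_byte(self, b):
        self.received.append(b)      # raw byte goes straight to the panel

def play_frame_sequence(buffer, start, length, controller):
    controller.set_origin()
    for offset in range(start, start + length):   # one byte after another
        controller.write_byte(buffer[offset])
```

Because the frame data already matches the display's pixel count, palette, and packing, this loop is the entire playback path on the 8-bit microcontroller.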
  • the core system 300 includes a mobile controller 481, which can be a PDA, tablet PC, or portable device specialized for retail, executing a software application for configuring the players 301 on site, by connecting to the site controller.
  • the mobile controller 481 connects to the site controller 480 using the WLAN 490 or a separate overlaid network such as a Wi-Fi network supporting IEEE 802.11 standard.
  • partial rendering of content is used to update only a part of the static image file already loaded into the player 301, for example if only the price variable 215 changes and the rest of the frame sequence stays the same.
  • the site controller 480 can transfer and replace only the changed frames at the player; therefore, the site controller 480 manages the buffer 340 of the player 301 on a byte level. This is implemented by adding storage byte-addressing to the application protocol which handles loading, or by dividing SDIG files into multiple chunks and addressing by chunks.
  • the site controller 480 includes means for remote management of the buffer for replacing a portion of the frame sequence stored in the buffer: a memory map for the player 301, identifying the current SDIG file(s) loaded on the player 301 and the memory locations where these file(s) are stored at the player 301; and a map matching dynamic variables with frames in these SDIG files.
  • the Site Controller, as it re-renders new SDIG files, identifies the frames or portions of frames that have actually changed. The Site Controller then transfers only the changed frames or portions to the appropriate players using an addressable load content command.
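The changed-frame detection can be sketched as follows, assuming fixed-size frames so that a frame index maps directly to a byte offset in the player buffer; that assumption and the function shape are illustrative, not taken from the specification.

```python
# Compare the newly rendered SDIG frame sequence against the copy the
# site controller knows is loaded on the player, and return only the
# frames that changed, keyed by frame index.

def changed_frames(old, new, frame_size):
    """old, new: raw frame-sequence bytes of equal length."""
    assert len(old) == len(new)
    diffs = {}
    for i in range(0, len(new), frame_size):
        if old[i:i + frame_size] != new[i:i + frame_size]:
            diffs[i // frame_size] = new[i:i + frame_size]
    return diffs
```

Each returned entry can then be sent with an addressable load content command, so a price change re-transmits only the frames containing the price.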
  • the storage capacity of the player 301 is doubled, providing the ability to preload or buffer content on the player, without replacing existing content. Then, at a specific time, a simple and very fast command can be sent to players to switch to the new content.
  • the low price of storage allows this approach to be taken without greatly increasing the price or complexity of players.
  • the SDIG file is compressed at the site controller 480 before transmission to the player 301.
  • the player 301 stores the compressed content, and after the transfer is complete, the player 301 decompresses the loaded content into an unused portion of storage, retrieving the original static image file. Subsequent playback is not affected by the compression.
  • in step 640, the site controller 480 loads a first frame sequence into the first buffer 340 of the player 301, and the player 301 starts playing the content of the first frame sequence, step 645. Then, the site controller 480 compresses a second frame sequence and sends it to the player 301, step 655.
  • in step 660, the player 301 receives and stores the compressed file in an unused portion of its storage, a second buffer not shown in FIG. 4, while the player 301 continues playback of its current content.
  • the site controller 480 sends a short command to the player, instructing it to switch to the new, still compressed content, step 665.
  • in step 670, the player stops playback of the current content and decompresses the second SDIG file from the second buffer into the first buffer 340 on top of the current content, overwriting it.
  • in step 680, the player 301 starts playback of the newly decompressed content and now has a free buffer.
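Steps 640 through 680 can be sketched as the following double-buffering flow; zlib stands in for whatever compression the site controller actually applies, which the specification does not name.

```python
import zlib

# Double-buffered preload: the player keeps playing from the first buffer
# while a compressed second frame sequence lands in the second buffer;
# on the switch command it decompresses the new content over the first
# buffer and frees the second.

class DoubleBufferedPlayer:
    def __init__(self):
        self.active = b""      # first buffer, currently playing
        self.staging = b""     # second buffer, preload area

    def load(self, frame_sequence):
        self.active = frame_sequence                 # steps 640-645

    def preload_compressed(self, compressed):
        self.staging = compressed                    # steps 655-660

    def switch(self):
        self.active = zlib.decompress(self.staging)  # steps 665-680
        self.staging = b""
```

The switch itself is a short command followed by a local decompression, so all players in a zone can change content at nearly the same instant.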


Abstract

A system for rendering and displaying graphical content in the form of a frame sequence, consisting of an animator, a distribution portal, one or more site controllers, one or more wireless transceivers, each connected to one of the site controllers via a fiber, cable, or wire link; and a plurality of players wirelessly connected to the transceivers.

Description

DIGITAL SIGNAGE SYSTEM WITH WIRELESS DISPLAYS
CROSS-REFERENCE TO RELATED APPLICATIONS
[0001] The present invention claims priority from U.S. Patent Application No. 60/867,118 filed Nov. 23, 2006, which is incorporated herein by reference.
TECHNICAL FIELD
[0002] The present invention relates to devices for displaying graphical content and, in particular, to large scale Digital Signage systems.
BACKGROUND OF THE INVENTION
[0003] Conventional Digital Signage (DS) systems include large, full-color off-the-shelf displays capable of displaying visual content received from a local server or the Internet. Conventional Digital Signage displays support most industry-standard content formats such as MPEG, Quicktime, Flash, Web Pages, PAL, NTSC, HDTV, etc., decoding and adapting the content at playback time, which requires additional processing power. The high cost of these large high-quality displays requires a significant initial deployment investment and may prevent large scale deployments.
[0004] The advent of new wireless technologies targeted at ubiquitous networking, sensor networks, etc. is offering new opportunities. These technologies generally are characterized by very low cost, high reliability, low decoding complexity, short to medium range, low data rate, and large scale in terms of network nodes. However, wirelessly transferring images to conventional large-size players greatly reduces the amount of content provided to a single display and the number of such displays connected to a single wireless transmitter, further hindering scalability of such systems.
[0005] An object of the present invention is to overcome the shortcomings of the prior art and provide a cost-effective scalable DS system having multiple wireless displays.
SUMMARY OF THE INVENTION
[0006] Accordingly, the present invention relates to a system for rendering and displaying a plurality of frame sequences comprising a core system comprising: a core instruction memory storing a core system instruction set for converting images into the plurality of frame sequences; a descriptor memory for storing one or more device descriptors; and a core system processing means for executing the core system instruction set and forming the plurality of frame sequences. The system further comprises a plurality of RF transmitters, each comprising first communication means for communicating with the core system over a fiber, cable or wire link; and second communication means for wirelessly transmitting the plurality of frame sequences to a plurality of players; each player comprising: an RF receiver for receiving one of the plurality of frame sequences; a buffer for storing said frame sequence; and a display for displaying said frame sequence; wherein each player is associated with one of the one or more device descriptors; and each of the plurality of frame sequences received by said player has a format dependent on parameters of said device descriptor; so that said player does not perform any step of: resizing, padding, color conversion, and gamma-correction on said frame sequence.
[0007] Another aspect of the present invention relates to a system comprising memory storing design instructions for providing a dynamic image file including one or more placeholder fields for receiving values, query instructions for providing the values for the one or more placeholder fields, and a final converter instructions for encoding the dynamic image file and the values into a static image file containing a frame sequence having the format dependent on the parameters of the device descriptor, as well as processing means for executing these instructions.
[0008] Another aspect of the present invention relates to a system comprising a plurality of players, wherein each of the players comprises an IEEE 802.15.4 receiver, processing means consisting of an 8-bit microcontroller, and an OLED display with resolution not higher than 200 x 100.
BRIEF DESCRIPTION OF THE DRAWINGS
[0009] The invention will be described in greater detail with reference to the accompanying drawings, wherein:
[0010] Figure 1 is a block scheme of the method of creating and displaying graphical content in accordance with the instant invention;
[0011] Figure 2 is a schematic illustration of data transformations within one frame;
[0012] Figure 3 is an illustration to a rendering process;
[0013] Figure 4 is a block scheme of a system for creating and displaying a plurality of frame sequences in accordance with the instant invention;
[0014] Figure 5 illustrates components of a core system instruction set;
[0015] Figure 6 is a schematic representation of a distributed system in accordance with the instant invention;
[0016] Figure 7 is a schematic representation of the WLAN;
[0017] Figure 8 is a schematic representation of the gateway 320;
[0018] Figure 9 is a schematic diagram of communication protocols used in WLAN;
[0019] Figure 10 is a block scheme of a method of displaying content at a player in accordance with one embodiment of the instant invention; and
[0020] Table 1 is a representation of the rendering process.
DETAILED DESCRIPTION
[0021] One application of the instant invention is a retail DS system for presenting customers with product related advertisement and price information, including at least 40 content players. Typically, it is a multi-store system having as many as 20,000 store shelf displays, or players, at a single retail location. This system will be used for illustration purposes throughout the instant application; however, the invention is not limited to such systems.
[0022] With reference to FIGs. 1 and 2, a method of creating and displaying graphical content will be described, wherein FIG. 1 presents method steps, and FIG. 2 illustrates transformations of data within one frame. In a content choosing step 110, graphical content for displaying at a DS player is chosen in an iterative process which includes selecting variable content and permanent content, and displaying a resulting image for evaluation.
[0023] During the selection of permanent content, graphical sources for displaying at a DS player are chosen. The permanent content includes such assets as images, video, animation, and text, as well as content meant for another media, such as television or the Internet. The permanent content, at least in part, is chosen from an asset store within the DS system or is imported. Alternatively, the permanent graphic content or its part is created using any editor for the particular system, for example by a professional graphic artist.
[0024] The variable content includes dynamic variables, such as a product price. The dynamic variables can be associated with products, product categories, customers, or entire networks. During the selection of the variable content, the dynamic variables are identified and labeled. A data source for providing value(s) for a dynamic variable, for example a retail backend database, may be identified at this step. Optionally, a dynamic variable can be parameterized by specifying its data type, maximum length, or dimensions of the space available on display.
[0025] A resulting frame or frame sequence, combining the permanent content and the variable content, is then displayed to simulate, as close as possible, the experience of a user viewing the DS player, wherein default values are displayed for not yet known dynamic variables. Preferably, a device descriptor associated with the DS player is used at this step so that the frame or frame sequence is created and optimized specifically for the particular player model; the process of creating the frame sequence ready for viewing is described in more detail later in reference to steps 120 and 140. The device descriptor includes a set of parameters related to a target player, such as display resolution, color depth, physical size, storage capacity, color palette, and supported functionalities. In the instance of two players having different parameters, more than one device descriptor may be employed. For simulating the user experience, the chosen assets are preferably adapted to the right color depth and resolution, wherein the conversion can be optimized using visual tools to provide the best output.
[0026] One or more project files result from the content choosing step 110. The project files include the chosen assets, or references to them, the chosen dynamic variables, and, optionally, the device descriptor and additional effects and transitions. Optionally, some of the chosen assets within the project file are adapted using the device descriptor as described hereinbefore.
[0027] In reference to FIG. 2, permanent content consisting of an image 200 is chosen from the asset store 205. Variable content including a dynamic variable 215 is combined with the image 200 in a frame 210 described in a project file.
[0028] In a dynamic image file creating step 120, all the frame data which is available, such as the permanent content and the additional effects and transitions, is rendered, or converted, into a format that can be played back without decoding by a player, also referred to herein as a player-specific format and raw data format. The dynamic content is stored as placeholders with information on which dynamic variables to use and how to format them. Dependent on the parameters in the device descriptor, this partial formatting may significantly reduce content size by removing unused asset data. Within the dynamic image file creating step 120, the project file is transformed into a dynamic image file (DIG) including one or more placeholder fields for receiving values. The dynamic image file is structured so as to contain various information blocks describing the content and its location within a frame.
[0029] The DIG Format is structured in that it contains various information blocks describing the content and its frame timing. In other words, while the format's structure is universal over all player devices, any frame data within a DIG file is player-model specific. Raw frame data can be described as a sequence of bytes that can be sent directly from the player's storage unit to the video controller of the display without analysis or processing. More specifically, the data itself is: a set of color indexes representing pixels, a sequencing order for these indexes, and a way of packing this sequence of indexes into bytes, where the color indexes, sequencing order, and packing method are defined by and are specific to the display or its video controller.
[0030] In reference to FIG. 2, a frame 220 illustrates adaptation of the image 200 using a player's device descriptor, which specifies, by way of example, a black and white M x N display. The upper K rows of the frame 220 contain the image 200 converted to K x N pixels, and the bottom (M - K) rows are reserved for the value of the dynamic variable 215. In the dynamic file 230, each black or white pixel is represented by a single bit, a first bit sequence 250-1 represents the first row in the frame 220, and the last bit sequence 250-K represents the last row in the frame 220. The placeholder 240 contains information related to the dynamic variable 215.
[0031] The rendering process consists of 4 steps, each step working with one layer of the content.
[0032] A Screen Layer is a parent layer upon which other layers are drawn; it includes the background area within the bounds of the actual display. Any part of a layer that is outside the bounds of the screen layer will not be visible. The actual size of this layer in pixels is determined by the target display. The screen layer is in screen space, which has coordinates (0,0) to (device resolution width - 1, device resolution height - 1).
[0033] An Image Layer contains the asset/source image/animation/video drawn onto the screen layer using an affine transformation defined by a view. Varying this view changes the position of the image on screen and can create animations. The view is automatically generated at creation such that the image is scaled to fit completely within the screen space. This means the image can fill the screen horizontally and/or vertically, but always at least in one dimension, i.e. there may be empty borders around the image. The image layer is in image space which has coordinates (0,0) to (image width - 1, image height - 1).
[0034] A Text Layer is a sequence of text labels drawn on top of the image layer but relative to the screen layer, using an affine transformation defined by a view. Varying this view changes the position of the text on screen and can create animations. Note that the (0,0) coordinate of a text label is the left point on the text's baseline (i.e. the bottom left corner of the bounding rectangle).
[0035] A View layer gives the set of parameters that specify the scaling and translation transformations that must be applied to a source, for example one chosen from the asset store 205, when placed in a target space. Scaling is applied to the source's dimensions directly, but the translation specified by the minX and minY properties is relative to the target space. In other words, the result of applying a View is typically to scale the source first, and then place it at point (minX, minY) in the target space.
[0036] By way of example, if the source has a size of 800 x 600 pixels, the target space is 128 x 64, minX = 10, minY = 20, and scaleX = scaleY = 0.1 (reduce to 10%), then the result will be a scaled source image with size 80 x 60 positioned at (10, 20) in the target space.
[0037] A view's source and target spaces need not be different. It can also be used to position and scale an element within a single space. FIG. 3 provides an example of Text Labels drawn into screen space.
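The View arithmetic of paragraph [0036] can be reproduced in a short sketch; the function name, argument order, and rounding behavior are illustrative assumptions, not part of the disclosure.

```python
# Apply a View to a source: scale the source's dimensions, then translate
# by (minX, minY) in the target space. Returns the placed rectangle as
# (x, y, width, height) in target coordinates.

def apply_view(src_w, src_h, min_x, min_y, scale_x, scale_y):
    return (min_x, min_y, round(src_w * scale_x), round(src_h * scale_y))
```

Running this on the example of paragraph [0036] (an 800 x 600 source, scale 0.1, placed at minX = 10, minY = 20) yields an 80 x 60 rectangle at (10, 20).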
[0038] Table 1 lists the steps of the rendering pipeline.
[0039] The creation of a dynamic image file from a project file includes color reduction of all project assets to a player-specific color palette, for example in the RGB or RGBa color space, and encoding individual frames or frame portions to the raw data format compatible with the video controller of the target display.
[0040] Within the dynamic image file creating step 120, assets in the form of text, images or video are adapted from their original format and color depth to that of the player, using a standard color reduction technique, such as a nearest neighbor algorithm or an error diffusion algorithm with a Floyd-Steinberg filter disclosed, for example, in U.S. Patent Nos. 6,844,882 issued Jan. 18, 2005 to Clauson, 7,171,045 issued Jan. 30, 2007 to Hamilton, and 6,201,612 issued Mar. 13, 2001 to Matsushiro et al., and a player-dependent palette, by way of example, a palette with 16 shades of one color or an 8-bit gray scale palette. The palette is indexed by attaching an index to a color, wherein the index is typically a 1-, 2-, 4- or 8-bit value.
[0041] In one embodiment, to choose the palette, the player is configured to exhibit a gamma curve that is as smooth as possible: the electrical characteristics of the display are plotted for each color index, and the palette is adjusted so that the difference in apparent intensity, as perceived by the human eye, between any two consecutive color indices is constant. Then each color of the palette is converted into the RGB metric. Advantageously, this technique eliminates the need for gamma correction on the player.
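One possible reading of the palette-selection procedure in [0041] is sketched below, assuming measured per-drive-level luminances and the CIE lightness approximation as the model of apparent intensity. These are assumptions for illustration only: the disclosure requires constant perceived steps between consecutive indices but does not prescribe this particular model.

```python
def choose_palette(measured_lum, levels=16):
    """Pick `levels` drive values (0-255) whose perceived-lightness steps
    are constant, given measured_lum[v] = measured luminance (0..1) at
    drive value v.

    Sketch of [0041]; apparent intensity is modeled with the CIE
    approximation L* ~ 116*Y**(1/3) - 16 (an assumption).
    """
    def lightness(y):
        return 116 * y ** (1 / 3) - 16 if y > 0 else 0.0

    lmax = lightness(measured_lum[-1])
    # Evenly spaced perceived-lightness targets, one per color index.
    targets = [i * lmax / (levels - 1) for i in range(levels)]
    # For each target, pick the drive value whose measured lightness
    # is closest to it.
    return [min(range(256), key=lambda v: abs(lightness(measured_lum[v]) - t))
            for t in targets]

# Example with a hypothetical display following a gamma-2.2 response.
lum = [(v / 255) ** 2.2 for v in range(256)]
palette = choose_palette(lum)
```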
[0042] After the color reduction step, each pixel has an RGB value available in the palette of the target player. Each frame is then converted to the raw data format by converting RGB values into color indexes, followed by a byte packing algorithm that aligns the color indexes within a byte stream. The result is a raw byte sequence for each frame that can be streamed directly to the display's video controller.
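The index-to-byte packing of [0042] can be sketched as follows. The high-bits-first order shown is one possible convention only, since the specification states that the actual packing method is player-specific and defined by the device descriptor.

```python
def pack_indexes(indexes, bits_per_index=4):
    """Pack color indexes into a raw byte stream, high bits first.

    Illustrative sketch of the byte-packing step in [0042]; the real
    bit order and packing method are defined per player.
    """
    assert 8 % bits_per_index == 0, "index width must divide a byte"
    per_byte = 8 // bits_per_index
    out = bytearray()
    for i in range(0, len(indexes), per_byte):
        b = 0
        for idx in indexes[i:i + per_byte]:
            # Shift previous indexes up and append the next one.
            b = (b << bits_per_index) | (idx & ((1 << bits_per_index) - 1))
        out.append(b)
    return bytes(out)

# Four 4-bit indexes -> two bytes ready to stream to the video controller.
print(pack_indexes([0x1, 0xF, 0x0, 0xA]).hex())  # "1f0a"
```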
[0043] Preferably, the same algorithm of encoding assets into DIG format is used in the content choosing step 110 for simulating the experience of a user viewing the DS player.
[0044] In a query step 130, values of dynamic variables to fill the placeholder fields in the dynamic image file are obtained, preferably by querying a database 245, which can be a part of the DS system or an external database such as a retail backend database. Alternatively, the current values of dynamic variables are provided by a central distribution system or by a local backend system such as a store back office. Alternatively, the values are provided manually. Optionally, the DS system synchronizes the displayed values with the database so as to change the displayed content as some of the dynamic variables change their values stored in the database.
[0045] In a creating static image file step 140, the dynamic image file and the values obtained in the query step 130 are combined into a static image file (SDIG) containing a frame sequence having the player-specific format dependent on the parameters of the device descriptor, so that the resulting static file can be played back by the player without decoding, as will be discussed further in this specification. All placeholders are substituted with the values obtained in the query step 130, converted to pixel data as described above in reference to the dynamic file creation step 120, and merged with pre-formatted frames. This process occurs every time a dynamic DIG file changes or every time a value of a dynamic variable changes. In the instance of a dynamic variable used by several DIG files, following the change of this variable all the files are updated and sent to the corresponding players.

[0046] In accordance with the instant invention, the static image file contains frame data in a device-specific way, in a format dependent on parameters of the device descriptor, and does not require any decoding at the player. From the very beginning, content is created for a target player by a content creation tool described later. The frame data is a sequence of bytes which can be sent directly from the player's memory to the video controller of the display without analysis or processing, such as decompression, resizing, padding, color conversion, gamma-correction, etc. More specifically, the data itself is a set of color indexes representing pixels, sequenced for being sent directly from a buffer to the video controller of the display, and packed into bytes, wherein the color indexes, sequencing order, and packing method are defined by and are specific to the player.
[0047] In reference to FIG. 2, the static image file 260 includes bit sequences 250₁ – 250ₖ copied from the dynamic file 230, and bit sequences 250ₖ₊₁ – 250ₘ containing the value 247 of the dynamic variable 215 in the pixel form.
[0048] A step of defining distribution parameters 150 defines playback schedule, target product, geographical and other parameters. The defining distribution parameters step 150 is optional: in the instance of a DS system providing the same frame sequence to all the DS players having the same device descriptor, the step of defining distribution parameters 150 is not necessary. Choice of the DS distribution parameters denoted by step 150 can be performed at any time within the timeframe of the method until the moment these parameters are used in step 160.
[0049] The distribution step 160 is for distributing content and schedules over the Internet or any IP-based network to multiple individual site controllers, each managing multiple players at a single location, such as a store. The site controllers will be discussed in more detail further in this specification. In one embodiment of the instant invention, the content is distributed in the form of dynamic image files including one or more placeholder fields for receiving values, whereas each site controller provides the values independently; relative to FIG. 1, the distribution step 160 is performed after the step of creating a dynamic file 120 and before the query step 130. In another embodiment, the content is distributed in the form of static image files wherein all the placeholders are already substituted with queried values, so that the distribution step 160 is performed after the creating static file step 140. In a very small configuration of the system including only one site controller performing the steps of content creation 110-140, the distribution step 160 is not performed.
[0050] The values of dynamic variables can also be transmitted as part of the distribution process if these values are imported, received, or synchronized by the distribution system. The transmission of content and dynamic values is usually independent since they change separately and asynchronously.
[0051] A content delivery step 170 includes transmitting static image files over a wireless LAN (WLAN) from the site controllers to multiple players. Optionally, the static image files are compressed before the transmission. The content is loaded in accordance with the campaign schedule defined at the step 150 or after the static image file has been updated. Content delivery, or load, is initiated by the site controller.
[0052] Upon receiving a static image file containing a frame sequence over the WLAN, a player stores this content in local, non-volatile memory, in a content storing step 180. This process can include decompression that occurs only once per transmission but is performed after transmission has completed.
[0053] In a content playing step 190, a microprocessor within the player reads its local nonvolatile memory and sends the frame data directly to the display's video controller. No processing is performed on the frame data, in particular, no steps of decompression, resizing, padding, color conversion, and gamma-correction are performed on the data. FIG. 2 shows image 270 displayed by the player.
[0054] An implementation of the method described hereinbefore with reference to FIGs. 1 and 2 will be discussed now. A system for creating and displaying a plurality of frame sequences shown in FIG. 4 includes a core system 300, a plurality of RF transmitters, and a plurality of players, wherein only one RF transmitter 320 and one player 301 are shown. In operation, the core system 300 creates graphical content in the form of frame sequences and wirelessly loads it into the players 301 for displaying.

[0055] The core system 300 includes memory 305 having at least two parts: a core instruction memory 306 storing a core system instruction set for converting images into the plurality of frame sequences, the asset store 205, and a descriptor memory 307 for storing one or more device descriptors associated with the players 301 so that each player 301 is associated with one device descriptor. Examples of the memory components 305 and 307 include random access memory (RAM), non-volatile memory such as a read-only memory (ROM), flash memory, Electrically Programmable Read-Only Memory (EPROM), Electrically Erasable Programmable Read-Only Memory (EEPROM), etc., and a disk storage device. A disk storage device can include any type of magnetic or optical storage device, such as a hard disk drive, a recordable and/or rewriteable compact disc (CD), a DVD, a DVD+RW, and the like.
[0056] The core system 300 also includes a core system processing means 310 such as one or more processors, microprocessors, controllers, and the like, for executing the core system instruction set and forming the plurality of frame sequences.
[0057] In reference to FIG. 5, the core system instruction set 400 stored in the core instruction memory 306 includes an image converter 401 for converting an image into a part of a frame for displaying on a screen of the player 301. In operation, the image converter 401 executed by the processing means 310 is employed in the dynamic image file creating step 120 and the static image file creating step 140. Relevant to the example shown in FIG. 2, the image converter 401 converts the upper part of the screen 210 during the dynamic image file creating step 120, and the bottom part of the screen 220 during the static image file creating step 140 after the placeholder has been substituted with the queried value.
[0058] The image converter 401 includes: first instructions 410 for providing frame pixels based on the image, wherein a number of the frame pixels is equal to a number of screen pixels in a corresponding part of the display screen; and second instructions 420 implementing one of the color reduction algorithms discussed relevant to the step 120, for reducing the number of colors of the image not to exceed the color resolution of the player 301 and for color-indexing the image pixels; wherein the number of screen pixels, the number of colors, and the palette of the player 301 are obtained from the device descriptor associated with the player 301; so that the player 301 does not perform any step of: decompression, resizing, padding, color conversion, and gamma-correction, when displaying the frame sequence.
[0059] Preferably, the core system instruction set 400 includes design instructions 430 for providing a dynamic image file including one or more placeholder fields for receiving values, executed at the creating dynamic image file step 120; query instructions 440 for providing the values for the placeholder fields, executed at the query step 130; and a final converter 450 for encoding the dynamic image file and the values into a static image file containing a frame sequence having the format dependent on the parameters of the device descriptor, executed at the creating static image file step 140. The aforedescribed converter instructions 401 are called by the design instructions 430 and the final converter 450.
[0060] Optionally, the core system instruction set 400 includes a campaign manager 460 for defining DS distribution parameters and supporting scheduled delivery of frame sequences to the players.
[0061] In one embodiment of the present invention, the method illustrated by FIG. 1 is simplified to disallow the use of dynamic variables, thereby excluding the creating dynamic file step 120 and the query step 130. Correspondingly, in reference to FIG. 5, the core instruction set 400 does not include the design instructions 430 and the query instructions 440.
[0062] The core system 300 can reside on one computer or be a distributed system, wherein parts of the system 300 communicate over a network.
[0063] In one embodiment shown in FIG. 6, the core system 300 is a distributed system 300₁ consisting of an animator 460, a distribution portal 470, and one or more site controllers 480 spatially separated from the rest of the system 300₁.
[0064] The animator 460 is a content creation tool for micro-advertisements and micro-promotions, designed as a creative tool for non-professionals and as an adaptation and conversion tool for content creation professionals. The animator 460 is for executing at least a part of the design instructions to perform the content choosing step 110, and, optionally, the steps 120-150 of creating the dynamic file, querying at least some of the dynamic variables, creating the static image file if all the dynamic variables are found, and defining the distribution parameters. By way of example, the animator 460 is the MicroAnimator software implemented by DigiCharm Inc.
[0065] The distribution portal 470 is for performing the distribution step 160 and providing centralized management of all players 301 on all sites, regrouped by customer, region, store, product, etc., as well as advertisement and promotion campaign management, including scheduling and regional targeting. Optionally, the distribution portal 470 includes the database
245, and performs the query step 130. The distribution portal 470 may be an external application integrated into the architecture shown in FIG. 6 by interfacing with the animator 460 and the site controllers 480. Most existing Content Distribution systems can be used, such as the one disclosed in U.S. Patent No. 6,785,704.
[0066] The animator 460 and the distribution portal 470 form together a design station, either distributed or residing in one location. The design station has the design instructions 430 stored in a design station memory and a design station interface for providing the dynamic image file or the static image file over a network to the site controllers 480, such as an Ethernet card or chip, a wireless transceiver, or any other interface. The query instructions and the final converter can be stored at the design station or at the site controller.
[0067] Each site controller 480 is an embedded PC or small server running a software application that manages an entire location, such as a store. The site controller 480 interfaces with the distribution system 470 to receive and store content and schedules for a single site.
[0068] The site controller 480 has first communication means for communicating with the design station over the network, including an interface, such as an Ethernet card or chip, a wireless transceiver, or any other interface, and software such as a TCP/IP stack. The site controller 480 also has second communication means for communicating with at least one of the RF transmitters, located separately from the site controller, over a fiber, cable, wire link, or the like, which is preferably a USB interface, and can be an Ethernet or any other interface. In the instance when the same physical interface is used both for communication with the design station and the RF transmitters, different parts of the two communication means are software instructions providing different addressing and different treatment of received data.

[0069] The site controller 480 is responsible for finalizing the static image file, if it has not been done yet by other components of the system, and, for this purpose, receives the dynamic image file and the remaining dynamic variables from the distribution system 470. Alternatively, the dynamic variables are queried from the database 245, or from a local backend system such as the store back office, and/or the site controller 480 provides an interface for manual input of values for the dynamic variables. After values for all the dynamic variables are provided, the site controller converts the dynamic image file into the static image file.
[0070] The site controller 480 distributes the static image file(s) to the players 301 based on the schedules received from the distribution system 470, and manages and monitors the players 301 within the location.
[0071] For converting a plurality of frame sequences into a player-specific format, the site controller has the design instructions, the query instructions, and the final converter instructions stored in the memory of the site controller and processing means for executing these instructions, as well as a device descriptor memory for storing one or more device descriptors, so that each player is associated with one device descriptor; and each frame sequence received by the player 301 has a format dependent on parameters of the device descriptor associated with the player 301; so that the player 301 does not perform any step of: decompression, resizing, padding, color conversion, and gamma-correction on said frame sequence.
[0072] The site controller 480 has at least one port for communicating with the animator 460 and/or the distribution portal 470 over a network, and at least one port for communicating with at least one of the RF transmitters, located separately from the site controller, over the fiber, cable, or wire link, by way of example a USB port.
[0073] In one embodiment, the site controller 480 is connected to a conventional Digital Signage infrastructure and provides distribution of the content and schedules supplied by the infrastructure to the players 301 and reports playback statistics and errors to the central system.
[0074] In one embodiment, the site controller 480 includes an RF transmitter for communication with the players 301.

[0075] Alternatively, the site controller 480 controls a wireless local area network (WLAN) 490 formed by the plurality of RF transmitters 320 for providing a wireless connection between the site controller 480 and the players 301. The WLAN 490 has a two-layer star topology shown in FIG. 7, wherein one or more RF transmitters 320, also referred to as gateways 320, are connected to the site controller 480. Since each wireless hop has a high impact on total throughput, the WLAN 490 has wireless hops 485 only between the gateway 320 and the players 301.
[0076] Each gateway 320 is a combination of hardware and firmware supporting and bridging two communication protocols. In reference to FIG. 8, the gateway 320 includes two types of communication means: a first interface 710 for communicating with the site controller 480 which is a part of the core system 300, over a fiber, cable or wire link 495; and a second interface 720 for wirelessly transmitting the plurality of frame sequences to the players 301. The first interface 710 can be a USB interface, or an Ethernet card or chip, or the like, and the second interface 720 is a wireless transceiver. Processing means 730 are for protocol translation between wireline and wireless connections.
[0077] In one embodiment, the WLAN 490 supports the wireless standard IEEE 802.15.4 and is segmented into sub-networks or zones, each having a radius of approximately 10 meters and a number of players ranging between 20 and 1000, dependent on the geometry of the location and the size and renewal rate of the content, wherein the gateway 320 is disposed at the center of the zone. Preferably, the number of players in each zone should be balanced across the zones. The gateway 320 supports the wireless standard IEEE 802.15.4 and transmits using one or more channels selected out of the 16 available channels. In order to reduce interference, the channels are selected so as to avoid having two adjacent or overlapping zones communicating on the same channel. Preferably, all the gateways 320 support the IEEE 802.15.4 standard for communication with the plurality of players.
[0078] In one embodiment, the site controller 480 includes means for dynamically managing sub-networks formed of the plurality of players in proximity to each of the RF transmitters connected to the site controller 480, implemented as network management software. The site controller 480 manages all the sub-networks, including discovery of devices in the network, maintains adjacency lists specifying communication distances between devices on the network, and dynamically forms sub-networks. The site controller 480 monitors the local WLAN 490, collects playback statistics for the location, and provides the playback statistics and monitoring information, including any detected errors, to the distribution system 470.
[0079] Optionally, the site controller 480 provides load balancing, distributing the content among the gateways 320 to achieve the highest possible "link quality", as defined by the IEEE 802.15.4 standard, on the wireless links 485. The WLAN 490 is thus a network having the site controller 480, a plurality of gateways 320 each connected over the fiber, cable, or wire link 495 to the site controller 480, and a plurality of displays 301, each wirelessly connected using the IEEE 802.15.4 protocol to one of the gateways 320, wherein at least one of the displays 301 is a dual-homed display 301₁ that can receive wireless signals from two of the gateways 320₁ and 320₂, and wherein the site controller performs load balancing by sending messages to the dual-homed display 301₁ via the less busy of the two gateways.
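The dual-homed load-balancing rule of [0079] reduces to choosing the less busy of the reachable gateways. A minimal sketch follows, assuming a per-gateway backlog measure as the "busyness" metric; the disclosure does not specify how busyness is quantified, so `queue_depth` is hypothetical.

```python
def route_message(display_id, gateways, queue_depth):
    """Choose the gateway that should carry a message to `display_id`.

    Sketch of the load-balancing rule in [0079]: a dual-homed display
    is reachable via two gateways, and the site controller sends via
    the less busy one. `gateways[display_id]` lists reachable gateway
    ids; `queue_depth[g]` is a hypothetical per-gateway backlog.
    """
    candidates = gateways[display_id]
    return min(candidates, key=lambda g: queue_depth[g])

# Dual-homed display "301-1" reachable via gateways "320-1" and "320-2".
reach = {"301-1": ["320-1", "320-2"]}
backlog = {"320-1": 7, "320-2": 2}
print(route_message("301-1", reach, backlog))  # "320-2"
```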
[0080] Additionally, the site controller 480 provides authentication of players using an Access Control List (ACL), message integrity, and sequential freshness to avoid replay attacks.
[0081] In reference to FIG. 9, communication protocols used in the WLAN 490 will be discussed now.
[0082] By way of example, the site controller 480 implements a protocol stack 610 shown in FIG. 9. The link 495 between the site controller 480 and the gateway 320 is governed by the Universal Serial Bus (USB) standard. In another embodiment the link 495 is governed by the RS-232 or RS-485 standards developed by the Electronic Industries Association, or by TCP/IP. Alternatively, the link 495 supports Ethernet protocol or any other physical and link layer protocols satisfying throughput requirements of the system.
[0083] At the network layer, the protocol stack 610 employs a distributed network protocol (DNP) for providing end-to-end packet transmission, from the site controller 480 to the player
301. A distributed network control protocol (DNCP) is for providing management functionality for the network itself including network formation, monitoring, error management and channel allocation. According to the instant invention, the DNCP is used for managing both the gateway 320 and the player 301.
[0084] The DNP and DNCP transmit messages between the Site Controller 480 and MicroPlayers 301, enabling the Site Controller 480 to discover Gateways 320 as they become connected or disconnected, and to query the gateway 320₁ for a list of connected MicroPlayers and for a list of neighboring Gateways within communication range of the gateway 320₁, thus providing the site controller 480 with the means for dynamically managing sub-networks. Gateways within communication distance from one another should use different channels to maximize bandwidth usage. The DNP and DNCP provide a reliable communication channel with authentication, message integrity, sequential freshness, and access control enabling the players 301 to communicate only with the gateway 320. In one embodiment, the Simple Network Management Protocol (SNMP) is used as the DNCP and the TCP/IP stack is used as the DNP.
[0085] At the application layer, the protocol stack 610 includes a distributed application protocol (DAP) supporting management of the players 301 and transmission of content thereto, independently of the underlying communication technology.
[0086] The DAP is used by the site controller 480 to communicate with the players 301 to transport content and commands to the players 301. Each of the players 301 has its own DIG Address, and a particular player 301₁ would only pick up messages that are meant for its specific address, allowing targeted messaging on shared communication channels.
[0087] Each of the players 301 may have different configuration parameters and support different operations. To simplify this diversification, the protocol uses a general block type to handle setting any name-value pairs. Each name-value pair is called a property and has a unique key. The player 301 has a list of property keys it supports and would receive Set Property operations for those keys only.
[0088] Each DAP operation is encapsulated in a block and all blocks have common structural elements, including a special type code identifying the operation. After the DAP blocks are built at the site controller 480, they are transmitted to the player 301 in messages of an underlying protocol, such as TCP or UDP over IP.
[0089] Each operation block includes: an operation identifier; an address this block is meant for; a data size, which is the number of bytes in the data field of the block; and data having a structure defined by the operation type.
[0090] The DAP supports the following operations:
[0091] The Ping operation is used to determine the presence of the player 301 and to initialize the connection with it; the meaning of initializing the connection depends on the communication medium being used. The block has no data and a data size of 0.
[0092] Load DIG Content operation sends a piece of playable content, i.e. a portion of the frame sequence, to the player 301 in the form of a DIG frame sequence for storing in the buffer 340. The data type is raw binary data and the data size is the size of the frame sequence. Upon receiving this block, the player 301 loads the received portion of the frame sequence into the buffer 340.
[0093] The Erase Content operation causes the player 301 to remove all content from the buffer 340. The block has no data and a data size of 0.
[0094] The Reset Device operation causes the player 301 to reset. The block has no data and a data size of 0.
[0095] Set Property operation advises the player 301 to set a new value for a property specified by its key. The data portion of the block consists of two fields: a Property Key field of a fixed length, and a Value field having length and type dependent on the Property Key, wherein the Property Key represents the property, and the Value provides a new value for the property.
Upon receiving this command, the player 301 changes the property value and may provide a visual cue that the property has been changed. By way of example, a property can be a contrast parameter of the display 370.
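The block structure of [0089] and the operations of [0090]-[0095] can be sketched as follows. The numeric operation codes, field widths, and byte order below are assumptions for illustration only: the disclosure names the fields and operations but does not define their encodings.

```python
import struct

# Hypothetical DAP operation type codes; the patent names the operations
# but not their numeric codes, so these values are assumed.
OP_PING, OP_LOAD, OP_ERASE, OP_RESET, OP_SET_PROPERTY = range(5)

def build_block(op, address, data=b""):
    """Serialize a DAP operation block per [0089]: operation identifier,
    target address, data size, then operation-specific data.

    The layout used here (1-byte op, 2-byte address, 2-byte size,
    big-endian) is illustrative only.
    """
    return struct.pack(">BHH", op, address, len(data)) + data

def parse_block(raw):
    """Decode a block built by build_block back into (op, address, data)."""
    op, address, size = struct.unpack(">BHH", raw[:5])
    return op, address, raw[5:5 + size]

# A Ping block has no data and a data size of 0 ([0091]).
blk = build_block(OP_PING, address=0x0301)
print(parse_block(blk))  # (0, 769, b'')
```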
[0096] In one embodiment, the DAP supports a 1-way mode of operation so that the player 301 can receive data, but cannot send anything back as a response. In case of a successful completion of an operation or in case of an error, the player 301 may display different visual indications to the user. For example, the player 301 provides a visual cue that it was pinged. In one embodiment the DAP supports a 2-way mode of operation, wherein the player 301 is capable of sending response messages indicating success or failure of a requested operation.
[0097] In reference to FIG. 9, the gateway 320 implements a protocol stack 620, including USB or another standard for communication over the link 495, and IEEE 802.15.4 standard for wireless communication between the gateway 320 and players 301. The players 301 implement a protocol stack 630 including the IEEE 802.15.4 standard.
[0098] In reference to FIG. 4, the player 301 includes an RF receiver 330 for receiving one of the frame sequences; a buffer 340, also referred to as player memory 340, for storing the received frame sequence; a display 370 and a video controller 371, associated with the display
370, for displaying the received frame sequence; and a player processing means 360 for controlling the components of the player described before. Preferably, the RF receiver 330 is a single IEEE 802.15.4 transceiver, for example, MC13202FC commercially available from Freescale Semiconductor, Inc. The buffer 340 may be flash memory, EPROM, or EEPROM.
The processing means 360 can be a general purpose processor, one or more microcontrollers, one or more FPGAs, or a combination thereof. Preferably, the processing means 360 is a single 8-bit microcontroller, also available from Freescale Semiconductor, Inc., and the player 301 has no other processing means besides the 8-bit microcontroller 360, the display microcontroller 371, and a transceiver controller.
[0099] The display 370 is a small Organic Light Emitting Diode (OLED) display or a Liquid Crystal Display (LCD), such as those available from OSRAM Opto Semiconductors. By way of example, the display 370 is an OLED display having a 128 x 64 resolution and a 16-shade gray scale palette.
[00100] The players 301 of the instant invention have a small screen with a diagonal size of less than 4 inches, a screen resolution, denoted as M x N in FIG. 2, not higher than 320 x 240, and a color depth of preferably 4 bits or less. By way of example, it takes less than half an hour to load 1 MB of media content to 10,000 players divided into 25 zones.

[00101] Upon receiving a message containing a portion of the Load DIG Content block, the firmware running on the microcontroller writes the received portion of the frame sequence directly and linearly into the buffer 340 without any data manipulation, augmenting the previously written part of the frame sequence.
[00102] Playback, the process of displaying a received frame sequence at the player 301, begins automatically if no DAP message is received after a predetermined short delay, such as 2 seconds. The playback includes the following steps: setting the video controller 371 to the origin point of the display; positioning the player buffer 340 to the beginning of the frame sequence, wherein the firmware specifies the memory address of the beginning of the frame sequence in the player buffer; reading the frame sequence data one byte after another from the buffer 340; and sending each byte to the display 370. Preferably, the display 370 has an 8-bit communication bus; otherwise the frame data is read in portions of a different size dependent on the display communication bus. The playback substantially consists of the above steps, meaning that only minor, less important steps are omitted, such as waiting for read or write operations to complete.
[00103] The playback as specified is effected by the facts (A) that, in response to the 'load' operation, the raw data of the frame sequence is written linearly into the buffer memory 340 wherein two consecutively received data portions are written in sequential parts of buffer 340 contrary to a conventional file system technique of decomposing data into blocks and storing these blocks of data in multiple locations not necessarily adjacent to each other, and (B) that each frame in the frame sequence received by the player 301 has the format dependent on parameters of the device descriptor associated with the player 301, in particular, each frame contains the same number of pixels as the display 370. Accordingly, the player 301 does not perform any step of: decompression, resizing, padding, color conversion, and gamma-correction on the received frame sequence.
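The playback loop of [00102]-[00103] can be sketched as follows. The `write_byte` and `set_origin` callables are hypothetical stand-ins for the video-controller interface of the display 370; nothing in the loop decompresses, resizes, pads, color-converts, or gamma-corrects the data.

```python
def playback(buffer, frame_size, write_byte, set_origin):
    """Stream raw frame data from the player buffer to the video controller.

    Sketch of the playback loop in [00102]: each stored byte is sent to
    the display unmodified. `write_byte` and `set_origin` are assumed
    names for the controller interface.
    """
    pos = 0
    while pos + frame_size <= len(buffer):
        set_origin()                      # reset controller to origin point
        for b in buffer[pos:pos + frame_size]:
            write_byte(b)                 # byte goes out exactly as stored
        pos += frame_size

# Capture output instead of driving real hardware.
out = []
playback(bytes(range(8)), frame_size=4,
         write_byte=out.append, set_origin=lambda: out.append("origin"))
print(out)  # ['origin', 0, 1, 2, 3, 'origin', 4, 5, 6, 7]
```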
[00104] Optionally, the core system 300 includes a mobile controller 481, which can be a PDA, tablet PC, or portable device specialized for retail, executing a software application for configuring the players 301 on site by connecting to the site controller. The mobile controller 481 connects to the site controller 480 using the WLAN 490 or a separate overlaid network such as a Wi-Fi network supporting the IEEE 802.11 standard.

[00105] In one embodiment, partial rendering of content is used to update only a part of the static image file already loaded into the player 301, for example if only the price variable 215 changes and the rest of the frame sequence stays the same. The site controller 480 can transfer and replace only the changed frames at the player; therefore the site controller 480 manages the buffer 340 of the player 301 on a byte level. This is implemented by adding storage byte-addressing to the application protocol which handles loading, or by dividing SDIG files into multiple chunks and addressing by chunks.
[00106] To support partial rendering, the site controller 480 includes means for remote management of the buffer for replacing a portion of the frame sequence stored in the buffer: a memory map for the player 301, identifying the current SDIG file(s) loaded on the player 301 and the memory locations where these file(s) are stored at the player 301; and a map matching dynamic variables with frames in these SDIG files. Whenever a new value is received for a dynamic variable, the Site Controller, as it re-renders new SDIG files, identifies the frames or portions of frames that have actually changed. The Site Controller then transfers only the changed frames or portions to the appropriate players using an addressable load content command.
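The changed-frame identification of [00106] can be sketched as a frame-level diff between the old and newly re-rendered SDIG byte streams. The `(offset, bytes)` addressing shown is an illustrative assumption standing in for the addressable load content command.

```python
def changed_frames(old_sdig, new_sdig, frame_size):
    """Identify frames that differ between two renders of an SDIG file.

    Sketch of [00106]: after re-rendering on a dynamic-variable change,
    only the changed frames are transferred, each addressed here by its
    byte offset in the player buffer (an assumed addressing scheme).
    """
    updates = []
    for off in range(0, len(new_sdig), frame_size):
        old = old_sdig[off:off + frame_size]
        new = new_sdig[off:off + frame_size]
        if new != old:
            updates.append((off, new))   # (buffer address, replacement bytes)
    return updates

# Only the second frame changed, so only it would be re-transmitted.
old = b"\x00" * 4 + b"\x11" * 4
new = b"\x00" * 4 + b"\x99" * 4
print(changed_frames(old, new, frame_size=4))  # [(4, b'\x99\x99\x99\x99')]
```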
[00107] In one embodiment, the storage capacity of the player 301 is doubled, providing the ability to preload or buffer content on the player, without replacing existing content. Then, at a specific time, a simple and very fast command can be sent to players to switch to the new content. The low price of storage allows this approach to be taken without greatly increasing the price or complexity of players.
[00108] In one embodiment, the SDIG file is compressed at the site controller 480 before transmission to the player 301. The player 301 stores the compressed content and, after the transfer is complete, decompresses the loaded content into an unused portion of storage, recovering the original static image file. Subsequent playback is not affected by the compression.
[00109] In reference to FIG. 10, in step 640 the site controller 480 loads a first frame sequence into the first buffer 340 of the player 301, and in step 645 the player 301 starts playing the content of the first frame sequence. Then, in step 655, the site controller 480 compresses a second frame sequence and sends it to the player 301. In step 660, the player 301 receives and stores the compressed file in an unused portion of its storage, a second buffer not shown in FIG. 4, while continuing playback of its current content. When the campaign schedule specifies that the new content should be played, the site controller 480 sends a short command to the player, instructing it to switch to the new, still compressed, content, step 665. Then, in step 670, the player stops playback of the current content and decompresses the second SDIG file from the second buffer into the first buffer 340, overwriting the current content. Finally, in step 680, the player 301 starts playback of the newly decompressed content and now has a free second buffer.
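The steps above can be sketched as a two-buffer state machine on the player. This is an assumed model for illustration: the buffers are plain byte arrays, zlib stands in for whatever compression scheme the system uses, and command framing and the radio transport are omitted.

```python
import zlib

class Player:
    """Two-buffer player: `first` holds the active decompressed content,
    `second` holds preloaded compressed content awaiting a switch."""

    def __init__(self, capacity: int):
        self.first = bytearray(capacity)
        self.second = b""

    def load_first(self, frames: bytes) -> None:
        """Step 640: load the first frame sequence; playback can begin."""
        self.first[:len(frames)] = frames

    def preload_compressed(self, blob: bytes) -> None:
        """Step 660: store compressed content in the second buffer while
        playback of the first buffer continues undisturbed."""
        self.second = blob

    def switch(self) -> None:
        """Steps 665-680: on the site controller's short switch command,
        decompress the second buffer over the first and free it."""
        new = zlib.decompress(self.second)
        self.first[:len(new)] = new
        self.second = b""
```

On the site controller side, step 655 corresponds to `zlib.compress(second_sequence)` before transmission; the switch itself costs only a short command plus a local decompress, which is why the changeover can be scheduled precisely.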
TABLE 1 (reproduced as an image in the original publication; its contents are not recoverable from this text extraction)

Claims

WE CLAIM:
1. A system for rendering and displaying a plurality of frame sequences comprising:
a core system comprising: a core instruction memory storing a core system instruction set for converting images into the plurality of frame sequences; a descriptor memory for storing one or more device descriptors; and a core system processing means for executing the core system instruction set and forming the plurality of frame sequences;
a plurality of RF transmitters, each comprising a first interface for communicating with the core system over a fiber, cable or wire link; and a second interface for wirelessly transmitting the plurality of frame sequences to a plurality of players;
each player comprising: an RF receiver for receiving one of the plurality of frame sequences; a buffer for storing said frame sequence; and a display for displaying said frame sequence;
wherein each player is associated with one of the one or more device descriptors; and each of the plurality of frame sequences received by said player has a format dependent on parameters of said device descriptor; so that said player does not perform any step of: resizing, padding, color conversion, and gamma-correction on said frame sequence.
2. A system defined in claim 1, wherein the core system instruction set comprises: design instructions for providing a dynamic image file including one or more placeholder fields for receiving values; query instructions for providing the values for the one or more placeholder fields; and a final converter for encoding the dynamic image file and the values into a static image file containing a frame sequence having the format dependent on the parameters of the device descriptor.
3. A system defined in claim 2, wherein the core system comprises
a design station comprising a design station memory having the design instructions and a design station interface for providing the dynamic image file or the static image file over a network; and
a site controller comprising first communication means for communicating with the design station over a network, and second communication means for communicating with at least one of the RF transmitters, located separately from the site controller, over the fiber, cable, or wire link;
and wherein the query instructions and the final converter are stored at the design station or at the site controller.
4. A system defined in claim 3, wherein the second communication means include a USB interface.
5. A system defined in claim 3, wherein the site controller includes means for dynamically managing sub-networks formed of the plurality of players in proximity to each of the plurality of RF transmitters connected to the site controller.
6. A system defined in claim 3, wherein the site controller includes means for remote management of the buffer for replacing a portion of the frame sequence stored in the buffer.
7. A system defined in claim 3, wherein the design station includes a portable animator device for executing at least a part of the design instructions.
8. A system defined in claim 1, wherein each of the plurality of RF transmitters and each of the RF receivers supports IEEE 802.15.4 standard for communication with the plurality of players.
9. A system defined in claim 1, wherein each of the plurality of players further comprises processing means consisting of an 8-bit microcontroller.
10. A system defined in claim 1, wherein the display in each of the plurality of players is an OLED display with resolution not higher than 320 x 240.
PCT/CA2007/002116 2006-11-23 2007-11-23 Digital signage system with wireless displays WO2008061372A1 (en)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
US86711806P 2006-11-23 2006-11-23
US60/867,118 2006-11-23

Publications (1)

Publication Number Publication Date
WO2008061372A1 true WO2008061372A1 (en) 2008-05-29

Family

ID=39429351

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/CA2007/002116 WO2008061372A1 (en) 2006-11-23 2007-11-23 Digital signage system with wireless displays

Country Status (1)

Country Link
WO (1) WO2008061372A1 (en)


Citations (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2005038629A2 (en) * 2003-10-17 2005-04-28 Park Media, Llc Digital media presentation system
US20060015531A1 (en) * 2004-07-19 2006-01-19 Moshe Fraind Device and system for digital signage
CA2591305A1 (en) * 2005-01-04 2006-07-13 Avocent California Corporation Wireless streaming media systems, devices and methods


Cited By (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20130076765A1 (en) * 2011-09-28 2013-03-28 Hakhyun Nam Image Data Displaying System and Method for Displaying Image Data
DE102015015542A1 (en) * 2015-11-16 2017-06-01 Infoscreen Gmbh A method and digital signage system for displaying displayable strings on distributed display devices
DE102015015542B4 (en) * 2015-11-16 2017-10-19 Infoscreen Gmbh A method and digital signage system for displaying displayable strings on distributed display devices


Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application (Ref document number: 07845581; Country of ref document: EP; Kind code of ref document: A1)
NENP Non-entry into the national phase (Ref country code: DE)
122 Ep: pct application non-entry in european phase (Ref document number: 07845581; Country of ref document: EP; Kind code of ref document: A1)