Embedded World Illuminates "TRON"

Advances in the embedded space complement those in high-performance computer-graphics animation, such as the innovations shown in the latest “TRON LEGACY” movie.

By John Blyler

Disney’s original science-fiction classic, “TRON,” is credited with ushering in the age of computer-graphics (CG) animation. Today, improved CG animation quality makes it difficult for viewers to distinguish live-action elements from CG-generated ones, as in “Avatar.” In the recent sequel to “TRON,” called “TRON LEGACY,” CG animation was used to create a convincing younger version of the now-older protagonist, Kevin Flynn (played by Jeff Bridges).

The advances in CG filmmaking reflect 30 years of progress in semiconductors, high-end PC and server hardware, and programming technology. What has gone unnoticed are the equal advances made in the world of embedded systems. Let’s look at just one aspect of the improvements brought by the embedded and sensor technology revolution: illumination control.

Virtual Light
By today’s standards, the original “TRON” movie had very little in the way of computer effects, although it was state-of-the-art at the time. Many of the computer-generated scenes looked little better than the graphically limited video games of the day. The suits in “TRON” weren’t illuminated. The “TRON look” was achieved by shooting the movie normally in black and white and then enlarging every frame onto 8-x-10, black-and-white film-positive cels for rotoscoping, a technique that allowed animators to trace over live-action movement, frame by frame.

According to Alan McFarland, CTO of Nila (specialists in film and television lighting), the “TRON” animators would then colorize each of those film cels by hand, using tint dyes, airbrushing, and similar techniques. Each cel was then re-photographed on a backlit animation stand with colored gels. Selective double-exposure techniques gave everything that was supposed to glow its particular aura. Making a feature-length film in that fashion was very expensive by the standards of the day and would be prohibitive today.

“For ‘TRON LEGACY,’ the suits actually illuminated on their own,” explains McFarland (see Figure 1). “Motion-picture cameras have more than enough sensitivity to capture the illumination as is, making digital rotoscoping of the suits in post-production mostly unnecessary.”


Most suits also contained a detachable Identity Disc, or Light Disc, that was a key element of the story. Each disc contained all of the memories (everything seen, heard, or experienced) of the anthropomorphized programs in the virtual world of “TRON.” Although a program’s glowing disc could be detached from the body for use as a weapon, it was normally mounted on the wearer’s upper back.

Inverting Power
Light-emitting-diode (LED)-based light discs and self-illuminating suits need power and control, even in the virtual world of “TRON.” This is where tiny, low-power embedded systems can win out over costly post-production CG animation.

The light discs used LEDs controlled by XBee modules for lighting, recalls McFarland. “The only time that Nila controlled the disc lighting was when the disc was attached to the costume. The studio’s prop department handled the disc lighting when it was in the actor’s hand.”

The suits were another matter, since they had to be flexible and tailored to the shape of each actor. To achieve flexibility, each suit was illuminated with custom electroluminescent (EL) material that could be shaped into various patterns. McFarland points out that some of the suits, such as Sam Flynn’s costume and Clu’s outfit, had more than 50 individual pieces of EL material. That material totaled some 1100 square inches per costume, almost half of the entire surface area of a typical human body.




The EL material required 290 V AC at 1100 Hz, albeit at very low current. This unusual power requirement necessitated a custom 150-W inverter to convert the direct current (DC) from the battery pack into alternating current (AC). Sam’s costume required two of these inverters, while Kevin Flynn’s costume, with fewer but much larger pieces of electroluminescent material, required four. The guards, Quorra, and the other suits typically needed only one of the custom inverters.
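The figures quoted above invite a quick sanity check. A back-of-envelope Python sketch, using only numbers from the text (the derived values are my own arithmetic, not measurements from the production):

```python
# Back-of-envelope check of the EL suit power figures quoted above.
# All inputs come from the article; the derived values are estimates.

EL_VOLTAGE_V = 290.0   # AC drive voltage for the EL material
EL_FREQ_HZ = 1100.0    # AC drive frequency
INVERTER_W = 150.0     # rating of one custom DC-to-AC inverter
EL_AREA_SQIN = 1100.0  # EL material per costume (Sam Flynn / Clu)

# Worst-case AC current at full inverter load (the EL actually draws far less)
max_current_a = INVERTER_W / EL_VOLTAGE_V

# Sam's costume spread two inverters across ~1100 sq. in. of EL material
sam_power_budget_w = 2 * INVERTER_W
power_density_w_per_sqin = sam_power_budget_w / EL_AREA_SQIN

print(f"max AC current per inverter: {max_current_a:.2f} A")
print(f"power budget per sq. in. of EL: {power_density_w_per_sqin:.3f} W")
```

Even at full load, each inverter supplies only about half an ampere, which is why the challenge was voltage conversion and packaging rather than raw current handling.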

These inverters were usually located in the Identity Disc hub, which was mounted to the back of the suit (see Figure 2). The available space inside the disc hub was about the same volume as a softball. Each disc contained at least two 150-W EL inverters plus a daughtercard for the wireless-network lighting control and monitoring module. Power for this embedded system was supplied by batteries that were typically located on the waist of the actor and disguised to look like part of the costume.

The new, super-high-energy-density batteries were developed especially for the Tesla Roadster. On the Sam Flynn and Clu costumes, these batteries provided about 11 min. of runtime. They could be fully recharged in 15 min.

The limited disc space meant that all of the electronic components had to be as small as possible. Unfortunately, smaller inverters generate more heat, which limited the maximum runtime of a single movie take to about 8 min. before the suit would overheat, notes McFarland. “Usually, this wasn’t a problem, as Joe Kosinski, the director, set up shots that ran considerably shorter.”

Keeping Their Cool
Heat-generating inverters reduce the performance of electronics as well as the actors that wear them. How did the performers, already overheating in rubberized fabric suits, keep their cool?

Cooling came from four 20-ton air conditioners, which chilled the set to about 40°F. As McFarland remembers, “Those of us off-camera had to wear parkas to keep warm on the set.” To further ensure that the electronics and actors didn’t overheat, he used thermal epoxy to attach a National Semiconductor LM34 temperature-sensor IC to the main inductor on each inverter. The sensor’s analog output fed the wireless monitoring-control module’s analog-to-digital converters (ADCs), which digitized the readings for transmission to the off-set control computer for display.

Writing the code for temperature monitoring was also pretty cool. “Only one line of code was needed to read temperature from the analog-to-digital converter,” notes Wade Patterson, CEO of Synapse. Because the wireless monitoring system was bi-directional, the code could be changed on the fly if it wasn’t working as desired. This ability to read sensor values quickly is important, especially as the number of nodes climbs into the hundreds.
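The conversion behind that one-line read is simple because the LM34 outputs an analog voltage of 10 mV per degree Fahrenheit. The sketch below is illustrative only: the function names are stand-ins for the SNAPpy-style ADC call, and the 10-bit resolution and 3.3-V reference are my assumptions, not the production hardware's actual specs.

```python
# Hedged sketch of the temperature read described above.
# The LM34 outputs 10 mV per degree Fahrenheit; the ADC resolution,
# reference voltage, and function names are illustrative assumptions.

ADC_BITS = 10            # assumed ADC resolution
VREF_V = 3.3             # assumed ADC reference voltage
LM34_V_PER_DEGF = 0.010  # LM34 scale factor: 10 mV/degF

def read_adc(channel):
    """Stand-in for a SNAPpy-style ADC read; returns a fake count."""
    return 242  # roughly 78 degF under the assumptions above

def adc_to_degf(count):
    """Convert a raw ADC count to degrees Fahrenheit."""
    volts = count * VREF_V / (2 ** ADC_BITS - 1)
    return volts / LM34_V_PER_DEGF

# The "one line" read-and-convert for the inverter temperature channel
inverter_temp_f = adc_to_degf(read_adc(0))
print(f"inverter temperature: {inverter_temp_f:.1f} degF")
```

Because the LM34 is linear in Fahrenheit, no lookup table or curve fitting is needed; one multiply and one divide cover the whole conversion.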

Wireless Control and Monitoring
An integral part of the embedded system used in “TRON LEGACY” was the wireless-network lighting control and monitoring module, which was developed by Synapse Wireless Inc. By using the company’s “SNAP” network, Nila’s McFarland was able to turn the suit lighting on and off instantly. Furthermore, the SNAP wireless software returned data to the control computer screen, showing battery levels, runtime, and inverter temperature. This real-time data enabled the movie’s director to maximize the use of special effects by monitoring the suit battery life and inverter temperature.
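A hedged sketch of the kind of status screen such a SNAP link makes possible. The structure, names, and thresholds below are my own illustrative assumptions, not Synapse's actual code; only the telemetry fields (battery level, runtime, inverter temperature) and the roughly 8-minute take limit come from the article.

```python
# Illustrative sketch (not Synapse's code) of the control-computer view
# described above: each suit node reports battery level, runtime, and
# inverter temperature, and the operator is warned before overheating.

from dataclasses import dataclass

TEMP_LIMIT_F = 160.0      # assumed inverter temperature ceiling
RUNTIME_LIMIT_MIN = 8.0   # the article's quoted maximum take length
BATTERY_LOW_PCT = 20.0    # assumed low-battery threshold

@dataclass
class SuitStatus:
    node: str
    battery_pct: float
    runtime_min: float
    inverter_temp_f: float

def warnings(status):
    """Return operator warnings for one suit's telemetry report."""
    msgs = []
    if status.inverter_temp_f >= TEMP_LIMIT_F:
        msgs.append(f"{status.node}: inverter overheating")
    if status.runtime_min >= RUNTIME_LIMIT_MIN:
        msgs.append(f"{status.node}: past safe take length")
    if status.battery_pct < BATTERY_LOW_PCT:
        msgs.append(f"{status.node}: battery low")
    return msgs

print(warnings(SuitStatus("sam_flynn", 15.0, 9.0, 165.0)))
```

In practice the director could glance at exactly this sort of readout to decide whether a suit had enough battery and thermal headroom for one more take.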

Although 104 suits were built to incorporate the wireless monitor-controller system, no more than 25 were ever used on the same day and far fewer used simultaneously. This was not a limitation of the technology, but rather a reality of the demands for shooting a movie. “I doubt I would’ve survived the show if we’d had any days that intense (requiring 104 suits to be controlled simultaneously),” said McFarland.




The aptly named “Sleepy Mesh” state in the SNAP network woke the nodes’ radio-frequency (RF) transceivers only as needed. “Sleepy Mesh” didn’t merely put one node to sleep; it could put the entire network to sleep at once. (Think of the first encounter with the Borg on “Star Trek.”) The power savings from having the entire network sleep were significant, far greater than a traditional mesh network can offer. In fact, each node’s battery life was extended up to the shelf life of the battery.
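The "battery life approaches shelf life" claim follows directly from duty-cycle arithmetic: average current draw collapses when the radio sleeps nearly all the time. The current figures below are illustrative assumptions of my own, not measured SNAP numbers:

```python
# Why a network-wide sleep helps: battery life scales roughly with the
# inverse of average current draw. The figures below are illustrative
# assumptions, not measured SNAP numbers.

AWAKE_MA = 25.0     # assumed current with the radio on
SLEEP_MA = 0.002    # assumed deep-sleep current (a few microamps)
BATTERY_MAH = 1000.0

def battery_life_hours(duty_cycle):
    """duty_cycle = fraction of time the node is awake (0..1)."""
    avg_ma = duty_cycle * AWAKE_MA + (1 - duty_cycle) * SLEEP_MA
    return BATTERY_MAH / avg_ma

always_on = battery_life_hours(1.0)    # radio never sleeps: days
sleepy = battery_life_hours(0.001)     # awake 0.1% of the time: years
print(f"always on: {always_on:.0f} h, sleepy mesh: {sleepy:.0f} h")
```

Under these assumptions an always-on node lasts well under two days, while a node awake 0.1% of the time runs for years, at which point the battery's own shelf life, not the radio, becomes the limit.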

Each Synapse system contained eight ADC inputs; 10 to 20 digital-output interface-control ports; and a low-power, 2.4-GHz, IEEE 802.15.4 personal-area-network (PAN) RF module. Data from each wireless module (see Figure 3) can use AES-128 encryption. “You can even get the SNAP software on a microcontroller the size of a fingernail,” explained Patterson. “We have one customer that has embedded such modules into clothing.”

Embedded Hollywood
How did Hollywood learn about Synapse Wireless, a small but growing, Alabama-based wireless control and monitoring company? That’s where Nila fits into our story.

Nila’s Alan McFarland was tasked with engineering the illumination and control of the suits for “TRON LEGACY.” McFarland had seen a Synapse technology demonstration in which the SNAP modules and language were used to radio-control a small tank. After comparing products from XBee, Redpine, and Sweden’s W-DMX, McFarland found that only the Synapse module met the requirements of limited space, bi-directional communication, and ready access to technical support. Synapse engineers David Ewing and Mark Guagenti supported McFarland throughout the production of “TRON LEGACY,” both in developing the Python-based SNAP software and with engineering help.

A testament to the modules’ ease of use: the code to control the suit lighting was developed in less than two weeks, according to Patterson. The coding was relatively easy, thanks to a Python virtual machine that separates application development from the underlying network-protocol details. (The combination of SNAP and Python is referred to as SNAPpy.) “End-user wireless applications are compiled into processor-independent ‘byte code’ that is run on the virtual machine. This means that the same application can be run on any processor without the need for recompilation,” explained Patterson.
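The compile-once, run-on-a-VM model Patterson describes is the same one standard Python uses. The analogy below uses stock CPython, not Synapse's SNAPpy toolchain, and the function name is hypothetical:

```python
# CPython analogy for the byte-code model described above: compile once
# to processor-independent byte code, then let any machine's Python VM
# execute it. Stock Python only; not Synapse's SNAPpy toolchain.

source = """
def set_suit_lighting(on):
    # hypothetical application-level function
    return "illuminated" if on else "dark"
"""

# compile() emits byte code for the VM, not native machine code
code_obj = compile(source, "<suit_app>", "exec")

namespace = {}
exec(code_obj, namespace)  # the virtual machine runs the byte code

print(namespace["set_suit_lighting"](True))   # -> illuminated
```

The same code object would run unchanged on any processor hosting the VM, which is exactly the portability property that let one SNAPpy application serve every suit node.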

Virtual machines running on wirelessly connected embedded processors? Doesn’t that sound vaguely like the world of cyberspace in “TRON”? Perhaps the writers of “TRON LEGACY” should’ve included at least one snappily attired program in the movie to acknowledge the advances in the real world of embedded systems.



John Blyler is the editor-in-chief of Chip Design and Embedded Intel magazines and the editorial director of Extension Media. John has co-authored several books on technology (Wiley and Elsevier) and has over 23 years of hardware-software systems-engineering experience in the electronics industry. He remains an affiliate professor in Systems Engineering at Portland State University. Mr. Blyler holds a BS in Engineering Physics from Oregon State University and an MSEE from California State University, Northridge.




Alan McFarland has been doing special-effects lighting on Hollywood movies for over 25 years, starting off in miniature FX lighting for movies like “The Hunt for Red October,” “Speed,” and “The Fifth Element.” He also has years of experience with lighting effects on costumes, having designed the lighting for the Borg suits for “Star Trek: First Contact,” Robin Williams’ robot suit for “Bicentennial Man,” and the blue-glowing motion-capture suit worn by Billy Crudup as Dr. Manhattan in “Watchmen.”




Wade Patterson is responsible for Synapse’s corporate vision, intellectual property, product strategy, and operational execution of the company. Patterson is the former president and CEO of Intergraph Corp.’s worldwide computer business. Prior to this, he was vice-president of engineering for Intergraph. Patterson is a distinguished fellow of the Mississippi State University College of Engineering and holds a BS in electrical engineering. He is a named inventor on 18 U.S. Patents.