DIGINFO TV


DigInfo TV is a Tokyo-based online video news site, founded in 2007. We are dedicated to original coverage of cutting-edge technology, research and products from Japan. Our news is available in both Japanese and English.

We are proud to be an official YouTube content partner. Our English and Japanese channels combined have over 90,000 subscribers, receive approximately 80,000 views daily, and have over 120 million views in total.

DigInfo TV is the news production arm of Digitized Information, a translation and video production company. Our original videos can be seen embedded all over the web, from technology blogs to major news sites, with articles written about our news coverage in dozens of languages worldwide.

DIGINFO TV - March 24, 2014

Job, a company based in Yokohama, offers the PORTA series of portable X-ray units in five models: two for human applications and three for animal applications. The company started developing these products in 1997 and released the first one commercially in 2006. The X-ray units are compact, lightweight, and durable, and are sold mainly overseas. They are currently manufactured at Job's plant in Yokohama.

Keystone Technology's LED vegetable garden system is a cultivation system for indoor plant factories which uses LED lighting instead of sunlight. The most defining feature of the system on display at the company's showroom in Yokohama is its 3-dimensional use of space.

Akihabaranews.com - Diginfo - Kosmek robotic hand changer

KOSMEK has developed a robotic hand changer that can switch between robot tools automatically - and with high precision. 

 

Researchers have developed a system that can show in 3D the complicated heart structures of babies with congenital heart conditions. The pictures are easy to understand, and can be rendered quickly.

 

Tensegrity structures are created using isolated struts held together with cables. By utilizing the tensile strength of the cables to place a compressive force on the struts, the structures feature high structural integrity and mechanical stability.
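The balance described above can be illustrated with a toy equilibrium check. The geometry, the equal cable tensions, and the single three-cable/one-strut node below are all invented for illustration; a real tensegrity design has to satisfy this balance at every node simultaneously.

```python
import numpy as np

# Toy static-equilibrium check at one node of a hypothetical tensegrity:
# three cables pull the node outward and downward, and one vertical strut
# pushes it upward. The node is stable when the cable tensions and the
# strut compression cancel exactly.
phi = np.deg2rad([0.0, 120.0, 240.0])   # cables spaced evenly around the node
theta = np.deg2rad(60.0)                # angle of each cable from vertical

# Unit direction of each cable force acting on the node (pulling away from it).
cable_dirs = np.stack([np.sin(theta) * np.cos(phi),
                       np.sin(theta) * np.sin(phi),
                       -np.cos(theta) * np.ones(3)], axis=1)

tension = 1.0                              # assumed equal tension in all cables
compression = 3 * np.cos(theta) * tension  # strut force needed to balance them

net_force = tension * cable_dirs.sum(axis=0) + np.array([0.0, 0.0, compression])
print(np.linalg.norm(net_force))  # ~0: the node is in equilibrium
```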

This is an AR support system for constructing objects with a tensegrity structure, created by the Hirasawa Lab at Chiba University.

"We liked this shape a lot. But things like this are difficult to design and build. So, we're using AR to support their construction."

Associate Professor Toshiaki Tsuji's Laboratory at Saitama University has developed R-cloud, a rehabilitation support robot that enables users to view how their own muscles move during rehabilitation and training.

 

At the University of Tokyo, the Hirose-Tanikawa Lab is doing R&D on a system that virtually creates and affects people's emotional states.

 

2x3D, developed by the Shirai Lab at Kanagawa Institute of Technology, is a system that lets the same screen be viewed simultaneously by people who want to watch it in 2D and 3D.

Whereas conventional passive 3D systems use polarizing filters for both the left and right eyes, this system uses a special picture-generating algorithm. Pictures for the left eye can be seen with the naked eye and only pictures for the right eye need to be viewed through a polarizing filter.
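One way to picture this, purely as an illustrative sketch and not the Shirai Lab's actual algorithm, is to treat each pixel's output as an unpolarized layer seen by everyone plus a polarized layer passed only by the filter, and solve for the two layers:

```python
import numpy as np

# Hypothetical decomposition behind the 2x3D idea: the screen emits an
# unpolarized layer U (seen by every viewer) plus a polarized layer P
# (passed only by the right-eye filter). The naked eye sees U + P, which
# should equal the ordinary 2D picture; the filtered eye sees P alone,
# which should approximate the right-eye picture.
def split_layers(image_2d, right_eye):
    # The polarized layer can never exceed the total light output, so the
    # right-eye picture is clipped against the 2D picture per pixel.
    p = np.minimum(right_eye, image_2d)
    u = image_2d - p          # the remainder is emitted unpolarized
    return u, p

image_2d = np.array([0.8, 0.5, 0.2])   # toy 3-pixel "2D" frame
right_eye = np.array([0.3, 0.6, 0.1])  # toy right-eye frame

u, p = split_layers(image_2d, right_eye)
print(u + p)  # naked-eye view: exactly the 2D frame
print(p)      # filtered view: right-eye frame, clipped where too bright
```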

 

The Igarashi Lab. at the University of Tokyo is developing a new type of physics engine which can faithfully recreate in 3D the stylizations common to 2D anime.

"Physics engines are used in 3D animation, to generate the motions of hair and clothes. They can do physically correct computing, but the problem is, that's all they can do. The engine we've created makes it possible for artists to adjust such physics computations, in other words, to deform things."

At International Robot Exhibition 2013, Microtech Laboratory exhibited the MES-6-125PST16C, an ultra-small rotary encoder that detects the rotational angle and speed of mechanical devices and motors with high accuracy.

"Typical rotary encoders are large industrial models mostly about palm-size, but this product is very small, able to fit on a fingertip."

DIGINFO TV - November 14, 2013

This is teamLabBody, co-developed by teamLab and Professor Sugamoto of the department of orthopedics at Osaka University. It is a 3D human anatomy app based on the joints and natural range of motion of the human body.

 

The Hairlytop Interface is an interactive surface display made of soft solid-state actuators and optical sensors that react to light. Jointly developed by the University of Electro-Communications and Symphodia Phil, the display moves organically, like a living thing, when placed on an iPad playing a video, responding to changes in the brightness of the screen.

"One feature of this system is that the motion is very cute, like that of an animal. Another feature is that it can be used extremely freely in terms of design."
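The brightness-to-motion coupling can be sketched very simply. The linear mapping and the 90-degree maximum bend below are assumptions for illustration, not the actual response curve of the Hairlytop actuators.

```python
# Hypothetical sketch of the Hairlytop idea: each hair-like actuator has an
# optical sensor reading the brightness of the screen beneath it, and bends
# in proportion to that reading. Mapping and range are made up here.
MAX_BEND_DEG = 90.0

def bend_angles(brightness):
    # brightness: per-actuator screen luminance, each value in [0, 1]
    return [MAX_BEND_DEG * b for b in brightness]

print(bend_angles([0.0, 0.5, 1.0]))  # → [0.0, 45.0, 90.0]
```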

 

This is an interactive optic fiber cloth. It is made of side-emitting diffusive optic fibers, so computer-controlled light patterns can be displayed on it, and by attaching light receivers as well as light emitters, optical signals can be input using infrared light.

"Originally, we came across the light-emitting fabric, where light is emitted from the optical fiber surface. Conversely, we thought it could be used to input data by receiving light through the surface. So, we've created this kind of interface."

 

Opect is an interface developed to enable surgeons to interact directly with images and data in an operating room environment. It has been developed at the Institute of Advanced Bioengineering and Sciences at the Tokyo Women's Medical University.

 

This mixed reality interface places virtual characters in the real world. It was developed by the Naemura Lab at the University of Tokyo.

Users can have an animated character jump onto their hand, as well as guide the character onto blocks, creating a novel interactive experience.

"Recently, devices have been developed that can form images in mid-air. We've utilized one of those, and combined it with sensors and a projector, to provide an intuitive display experience where a picture in the air is skillfully merged with the real world."

 

This system synchronizes video tracks with an independent audio source in real-time. It has been developed by researchers at Waseda University.

"Ordinarily, to play video and audio content in sync, some type of synchronization signal is necessary. But this system doesn't need anything like that as it can synchronize video by using sound played in the same space."
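A common way to align two signals without an explicit sync channel is cross-correlation: find the lag at which the captured room audio best matches the reference track. The sketch below is a minimal illustration of that general idea, not the Waseda system's actual algorithm.

```python
import numpy as np

# Estimate how far the captured audio lags the reference track by finding
# the peak of their cross-correlation.
def estimate_offset(reference, captured, sample_rate):
    corr = np.correlate(captured, reference, mode="full")
    lag = np.argmax(corr) - (len(reference) - 1)  # lag in samples
    return lag / sample_rate                      # offset in seconds

sr = 1000
rng = np.random.default_rng(0)
reference = rng.standard_normal(sr)                    # 1 s stand-in signal
captured = np.concatenate([np.zeros(250), reference])  # heard 0.25 s late

print(estimate_offset(reference, captured, sr))  # → 0.25
```

With the offset known, the video player can simply shift its playback position to match the audio heard in the room.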

 

SEL has developed a range of flexible OLED displays and high-PPI LCD panels that use CAAC oxide semiconductors.

"CAAC stands for C-Axis Aligned Crystal. In this material's structure, the crystals are aligned in the c-axis direction. Because CAAC itself is crystalline instead of amorphous, it has much higher reliability. Until now, with oxide semiconductors, reliability was generally thought to be a problem, but using this material solves that problem."

 

This AquaTop display is a touchscreen display for your bath.

The display is projected onto a bath filled with water mixed with bath salts, and a Kinect is used to detect interaction. It can recognize individual fingers sticking out of the water by 1.4 cm or more, as well as interactions from above the surface. It is being developed by the Koike Lab at the University of Electro-Communications.
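The detection step can be pictured as simple depth thresholding. The camera distance, the calibration, and the toy depth map below are invented; the real AquaTop pipeline involves more (segmentation and tracking), and this shows only the thresholding idea.

```python
import numpy as np

# Hypothetical sketch of depth-threshold detection: given a depth map from
# an overhead Kinect and the known distance to the water surface, any pixel
# at least 1.4 cm closer to the camera than the surface is treated as a
# fingertip candidate.
WATER_DEPTH_CM = 100.0   # camera-to-surface distance (assumed calibrated)
MIN_HEIGHT_CM = 1.4      # minimum protrusion the system recognizes

def fingertip_mask(depth_cm):
    height = WATER_DEPTH_CM - depth_cm   # how far each pixel rises above water
    return height >= MIN_HEIGHT_CM

depth = np.array([[100.0, 99.5, 98.2],
                  [100.0, 98.5, 100.0]])  # toy 2x3 depth map in cm
print(fingertip_mask(depth))
```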

 

The neurocam is the world's first wearable camera system that automatically records what interests you.

It consists of a headset with a brain-wave sensor and connects to an iPhone. The system estimates whether you're interested in something from the brain-waves captured by the sensor, and uses the iPhone's camera to record the scenes that appear to interest you.

Just by wearing the neurocam, the scenes you have shown an interest in are recorded automatically and added to an album which you can look through later.
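The recording decision can be sketched as a simple threshold on a running interest score. The score values and the threshold of 60 below are made up for illustration; the neurocam's actual brain-wave estimation is not described here.

```python
# Hypothetical sketch of the neurocam's recording logic: the headset yields
# a per-moment "interest" score estimated from brain waves, and the iPhone
# camera captures the scene whenever the score crosses a threshold.
INTEREST_THRESHOLD = 60

def scenes_to_record(interest_scores):
    # Return the indices (moments) whose score crosses the threshold.
    return [i for i, score in enumerate(interest_scores)
            if score >= INTEREST_THRESHOLD]

scores = [42, 55, 71, 63, 30, 88]   # made-up per-second interest readings
print(scenes_to_record(scores))     # → [2, 3, 5]
```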
