Saturday 9 July 2011

LG 3D Smartphone

The company launched its LG Optimus 3-D phone Thursday in South Korea after beginning a global release last month covering more than 60 markets including Spain and Britain. No special glasses are required to view 3-D content on the phone.
Park Jong-seok, CEO of LG's mobile communications business, said that 3-D smartphones such as the Optimus can hold their own against dedicated handheld gaming platforms.
"The era of dedicated handheld gaming is over," he said in a release.
Success with the 3-D phone would be welcome for the South Korean company, which has been struggling to overcome weakness in mobile phones. LG's mobile communications business has suffered four straight quarterly operating losses.
LG unveiled the new device, its first 3-D smartphone, earlier this year at the Mobile World Congress trade show in Spain and began rolling it out in global markets last month.
LG said the phone comes pre-installed with three full versions of games from Gameloft including "Asphalt 6: Adrenaline."
"You can actually now play exciting 3D games, advanced 3D games directly on your mobile phone," said Alexandre Tan, Gameloft's director of business development, said at a launch event Thursday in Seoul. "Clearly this is the next big thing for both the gaming and the mobile industry."
The LG Optimus 3-D can record and play back 3-D content. LG is not alone in launching such a phone. Taiwan's top smartphone maker HTC Corp. is also out with its EVO 3-D smartphone.
LG debuted the Optimus 3D, billed as the world's first 3D smartphone to offer consumers a full 3D experience in the palm of their hands, at the 2011 Mobile World Congress in Barcelona, Spain, on Feb. 14. The phone addresses the shortage of 3D content, one of the biggest problems facing the 3D market, with a complete platform for a one-of-a-kind mobile experience. LG's most advanced smartphone to date features a dual-lens camera for 3D recording, a glasses-free LCD panel for 3D viewing and connectivity options such as HDMI and DLNA for sharing 3D content anytime, anywhere. LG planned to roll out the device worldwide from the second half of the year.

Friday 8 July 2011

StuntMania: Intelligent Helicopters

Scientists at Stanford have developed an artificial intelligence system that enables robotic helicopters to teach themselves to fly difficult stunts by watching other helicopters perform the same maneuvers. The work could lead to autonomous helicopters capable of performing a complete airshow of complex stunts on their own.

The project is led by Professor Andrew Ng, who directed the research of graduate students Pieter Abbeel, Adam Coates, Timothy Hunter and Morgan Quigley. The stunts these intelligent helicopters perform are far more difficult than those flown by any other computer-controlled helicopter. The team developed learning algorithms that let the helicopters teach themselves simply by observing helicopters flown by experts.
The experiment is an important demonstration of apprenticeship learning, in which robots learn by observing an expert. Stanford's artificial intelligence system learned how to fly by "watching" four-foot-long helicopters flown by expert radio control pilot Garett Oku.

This advanced helicopter can learn and perform actions such as traveling flips, rolls, loops with pirouettes, stall-turns with pirouettes, a knife-edge, an Immelmann, a slapper, an inverted tail slide and a hurricane, described as a "fast backward funnel."
Previous autonomous helicopters could fly stunts only by replaying the exact finger movements an expert pilot made on the joysticks of the helicopter's remote controller. The problem is that uncontrollable variables such as gusting winds make such a rigid replay unreliable. To solve this, the researchers had Oku and other pilots fly entire airshow routines while every movement of the helicopter was recorded, and from these noisy demonstrations the system infers the trajectory the pilot was trying to fly (a rough sketch of that idea follows below).
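As a purely illustrative sketch of that idea (not the Stanford team's actual algorithm), the snippet below time-normalizes a few noisy demonstrations and averages them into a single reference trajectory. The `resample` and `intended_trajectory` helpers, the array shapes and the synthetic loop data are all assumptions made for the example.

```python
import numpy as np

def resample(demo, n_steps):
    """Linearly resample one recorded demonstration (a T x D state array)
    onto a common time base of n_steps samples."""
    t_old = np.linspace(0.0, 1.0, len(demo))
    t_new = np.linspace(0.0, 1.0, n_steps)
    return np.column_stack(
        [np.interp(t_new, t_old, demo[:, d]) for d in range(demo.shape[1])]
    )

def intended_trajectory(demos, n_steps=200):
    """Estimate the maneuver the pilot was trying to fly by averaging
    several noisy demonstrations after putting them on a common clock."""
    aligned = np.stack([resample(d, n_steps) for d in demos])
    return aligned.mean(axis=0)  # per-timestep mean state

# Illustrative data: three noisy recordings of the same loop (x, y, z position).
rng = np.random.default_rng(0)
t = np.linspace(0.0, 2.0 * np.pi, 180)
clean = np.column_stack([np.cos(t), np.sin(t), 0.1 * t])
demos = [clean + 0.05 * rng.standard_normal(clean.shape) for _ in range(3)]
target = intended_trajectory(demos)
print(target.shape)  # (200, 3) reference trajectory for the flight controller
```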

This system relies on instrumentation mounted partly on the helicopter and partly on the ground. These instruments, which include accelerometers, gyroscopes and magnetometers, monitor the helicopter's position, direction, orientation, velocity, acceleration and spin in several dimensions. A ground-based computer crunches the data, makes quick calculations and beams new flight directions to the helicopter via radio 20 times per second.
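To make the 20-updates-per-second figure concrete, here is a minimal, hypothetical sketch of what such a ground-station loop could look like; `read_state`, `compute_command` and `send_command` are placeholder callables, not part of any published Stanford code.

```python
import time

CONTROL_RATE_HZ = 20          # update rate quoted above
DT = 1.0 / CONTROL_RATE_HZ

def control_loop(read_state, compute_command, send_command):
    """Ground-station loop: read the fused sensor data (accelerometers,
    gyroscopes, magnetometers, ...), compute a correction toward the
    reference trajectory, and radio it to the helicopter 20 times a second."""
    next_tick = time.monotonic()
    while True:
        state = read_state()              # position, orientation, velocity, ...
        command = compute_command(state)  # e.g. feedback toward the target path
        send_command(command)             # uplink over the radio link
        next_tick += DT
        time.sleep(max(0.0, next_tick - time.monotonic()))
```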

These intelligent helicopters represent a new generation of robust, reliable aircraft that can fly just as well as their human-piloted counterparts.

NEXI - Robot with facial expressions

The MIT Media Lab has built a new robot that can slant its eyebrows in anger or raise them in surprise, and display a wide assortment of other facial expressions while communicating with people.

This latest achievement in robotics is named Nexi because it is framed as a next-generation robot, aimed at a range of applications in personal robotics and human-robot teamwork.

The head and face of Nexi were designed by Xitome Design, an innovative company that specializes in robotic design and development. The expressive robotics starts with a neck mechanism sporting four degrees of freedom (DoF) at the base, plus pan-tilt-yaw of the head itself. The mechanism is constructed to time its movements so they mimic human speed. Nexi's face uses gaze, eyebrows, eyelids and an articulated mandible to express a wide range of emotions; a rough sketch of how such speed-limited pose changes might be commanded is given below.
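As a purely illustrative sketch (Nexi's real control software and joint layout are not public), the snippet below maps an expression label to joint targets and ramps each joint at a capped, human-like angular speed. The joint names, angles and rates are invented for the example.

```python
import time

# Invented joint targets (degrees) for a few expressions; illustrative only.
EXPRESSIONS = {
    "surprise": {"brow_left": 20.0, "brow_right": 20.0, "eyelids": 90.0, "jaw": 30.0},
    "anger":    {"brow_left": -15.0, "brow_right": -15.0, "eyelids": 40.0, "jaw": 5.0},
    "neutral":  {"brow_left": 0.0, "brow_right": 0.0, "eyelids": 70.0, "jaw": 0.0},
}

def move_to(expression, current, max_speed_dps=60.0, rate_hz=50):
    """Ramp each facial joint toward its target at a capped angular speed,
    so the pose change happens at a human-like pace instead of snapping."""
    target = EXPRESSIONS[expression]
    step = max_speed_dps / rate_hz      # max degrees moved per control tick
    while any(abs(target[j] - current[j]) > 1e-3 for j in target):
        for joint, goal in target.items():
            delta = goal - current[joint]
            current[joint] += max(-step, min(step, delta))
            # send_servo(joint, current[joint])  # hardware write, omitted here
        time.sleep(1.0 / rate_hz)
    return current

pose = move_to("surprise", dict(EXPRESSIONS["neutral"]))
```
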
The chassis of Nexi is also advanced. It was developed by the Laboratory for Perceptual Robotics at the University of Massachusetts Amherst and is based on the uBot5 mobile manipulator. The mobile base can balance dynamically on two wheels, the arms can pick up weights of up to 10 pounds, and the plastic covering of the chassis can detect any kind of human touch.
The project is headed by the Media Lab's Cynthia Breazeal, a well-known robotics expert famous for earlier expressive robots such as Kismet. She is an Associate Professor of Media Arts and Sciences at MIT and calls the new robot an MDS (mobile, dexterous, social) robot.

Beyond its wide range of facial expressions, Nexi has many other features. It has self-balancing wheels, like the Segway transporter, that it will ultimately ride on; currently it uses an additional set of supporting wheels to operate as a statically stable platform in this early stage of development. It has hands that can manipulate objects, eyes (video cameras), ears (an array of microphones), and a 3-D infrared camera and laser rangefinder, which together support real-time tracking of objects, people and voices as well as indoor navigation.

Power from the Air: Device Captures Ambient Electromagnetic Energy to Drive Small Electronic Devices

Researchers have discovered a way to capture and harness energy transmitted by such sources as radio and television transmitters, cell phone networks and satellite communications systems. By scavenging this ambient energy from the air around us, the technique could provide a new way to power networks of wireless sensors, microprocessors and communications chips.

 Manos Tentzeris holds a sensor (left) and an ultra-broadband spiral antenna for wearable energy-scavenging applications. Both were printed on paper using inkjet technology.

"There is a large amount of electromagnetic energy all around us, but nobody has been able to tap into it," said Manos Tentzeris, a professor in the Georgia Tech School of Electrical and Computer Engineering who is leading the research. "We are using an ultra-wideband antenna that lets us exploit a variety of signals in different frequency ranges, giving us greatly increased power-gathering capability."
Tentzeris and his team are using inkjet printers to combine sensors, antennas and energy-scavenging capabilities on paper or flexible polymers. The resulting self-powered wireless sensors could be used for chemical, biological, heat and stress sensing for defense and industry; radio frequency identification (RFID) tagging for manufacturing and shipping; and monitoring tasks in many fields, including communications and power usage.
Communications devices transmit energy in many different frequency ranges, or bands. The team's scavenging devices can capture this energy, convert it from AC to DC, and then store it in capacitors and batteries. The scavenging technology can presently take advantage of frequencies from FM radio to radar, a range spanning 100 megahertz (MHz) to 15 gigahertz (GHz) or higher.
Scavenging experiments utilizing TV bands have already yielded power amounting to hundreds of microwatts, and multi-band systems are expected to generate one milliwatt or more. That amount of power is enough to operate many small electronic devices, including a variety of sensors and microprocessors.
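For a rough sense of scale, the back-of-the-envelope sketch below estimates how much DC power a single antenna might recover from one band, using the standard effective-aperture relation and an assumed rectifier efficiency. The ambient power density, antenna gain and efficiency figures are assumptions for illustration, not measurements from the Georgia Tech work.

```python
import math

C = 3.0e8  # speed of light, m/s

def harvested_dc_power(power_density_w_m2, freq_hz,
                       antenna_gain=1.6, rectifier_efficiency=0.3):
    """Rough single-band estimate of usable DC power from ambient RF.

    Effective aperture of the receiving antenna: A_e = G * lambda^2 / (4*pi).
    Captured RF power = ambient power density * A_e; the rectifier then
    converts only a fraction of that RF power into DC."""
    wavelength = C / freq_hz
    effective_aperture = antenna_gain * wavelength ** 2 / (4.0 * math.pi)
    rf_power = power_density_w_m2 * effective_aperture
    return rectifier_efficiency * rf_power

# Assumed numbers only: ~1 mW/m^2 ambient density in a TV band around 600 MHz.
p_dc = harvested_dc_power(1e-3, 600e6)
print(f"~{p_dc * 1e6:.1f} microwatts of DC from one band")
# Summing many bands with an ultra-wideband antenna (the multi-band approach
# described above) is what pushes the total toward the milliwatt range.
```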
Utilizing ambient electromagnetic energy could also provide a form of system backup. If a battery or a solar-collector/battery package failed completely, scavenged energy could allow the system to transmit a wireless distress signal while also potentially maintaining critical functionalities.
The researchers are using inkjet technology to print these energy-scavenging devices on paper or flexible paper-like polymers, a technique they are already using to produce sensors and antennas. The result would be paper-based wireless sensors that are self-powered, low cost and able to function independently almost anywhere.

Europe-wide Green eMotion initiative to pave the way for electric vehicles

Within the Green Cars Initiative, launched in the context of the European Recovery Plan, the European Union supports research and development of road transport solutions with the potential to achieve a breakthrough in the use of renewable and non-polluting energy sources. With fossil resources dwindling, electromobility and electric vehicles (EVs) are becoming ever more important, especially with respect to climate change. To this end, the Green eMotion project was selected to enable mass deployment of electromobility in Europe.
The Irish launch of an EU funded electric vehicle project took place on Tuesday, June 21st, in Dublin. A total of 42 partners involving car manufacturers, energy utilities, universities, and technology and research institutions across Europe are joining forces in the Green eMotion EU Project to advance the use of electric vehicles.
Four of the partners based in Ireland – ESB, Trinity College Dublin, Codema and Cork City Council – will receive €1.5 million in funding out of a total budget of €24 million. The Green eMotion EU Project was launched at Trinity College by the Minister for Communications, Energy and Natural Resources, Mr Pat Rabbitte, who congratulated the Irish partners on taking the lead in a significant EU-wide research project.

The four-strong consortium will work together to conduct research and studies into national electric car use and the different technologies that can be deployed to maximize sustainable transport methods.

As well as contributing to the overall objectives, the Irish partners are to develop design criteria for electric vehicle charging networks and for fleet management of electric vehicles, and to study connection and construction techniques for charging points. More advanced charging systems are being developed as part of the project, and some of these will be field-trialled in Ireland.