Wednesday, 27 May 2015

Quantum computer emulated by a classical system.

Drs. Granville Ott (left) and Brian La Cour (center) with student Michael Starkey (right) beside their prototype quantum emulation device. Credit Applied Research Laboratories, The University of Texas at Austin
Source: Phys.org
--------------------
Quantum computers are inherently different from their classical counterparts because they involve quantum phenomena, such as superposition and entanglement, which do not exist in classical digital computers. But in a new paper, physicists have shown that a classical analog computer can be used to emulate a quantum computer, along with quantum superposition and entanglement, with the result that the fully classical system behaves like a true quantum computer.
Physicist Brian La Cour and electrical engineer Granville Ott at Applied Research Laboratories, The University of Texas at Austin (ARL:UT), have published a paper on the classical emulation of a quantum computer in a recent issue of the New Journal of Physics. Besides having fundamental interest, using classical systems to emulate quantum computers could have practical advantages, since such quantum emulation devices would be easier to build and more robust to decoherence compared with true quantum computers.
"We hope that this work removes some of the mystery and 'weirdness' associated with quantum computing by providing a concrete, classical analog," La Cour told Phys.org. "The insights gained should help develop exciting new technology in both classical analog computing and true quantum computing."
As La Cour and Ott explain, quantum computers have been simulated in the past using software on a classical computer, but these simulations are merely numerical representations of the quantum computer's operations. In contrast, emulating a quantum computer involves physically representing the qubit structure and displaying actual quantum behavior. One key quantum behavior that can be emulated, but not simulated, is parallelism. Parallelism allows multiple operations on the data to be performed simultaneously—a trait that arises from superposition and entanglement, and enables quantum computers to operate at very fast speeds.
To emulate a quantum computer, the physicists' approach uses electronic signals to represent qubits, in which a qubit's state is encoded in the amplitudes and frequencies of the signals in a complex mathematical way. Although the scientists use electronic signals, they explain that any kind of signal, such as acoustic and electromagnetic waves, would also work.
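The frequency-based encoding can be illustrated with a toy model. The carrier frequencies, amplitude values, and correlation-based demodulation below are assumptions made for the sketch, not the paper's actual construction: the idea is simply that a 2-qubit state's four amplitudes can ride on four carrier tones of a single classical signal and be recovered afterwards.

```python
import cmath
import math

# Illustrative only (not ARL:UT's actual scheme): encode the four complex
# amplitudes of a 2-qubit state on four distinct carrier frequencies of one
# classical analog signal, then recover them by correlating against each carrier.

amps = [0.5, 0.5j, -0.5, 0.5]   # normalized amplitudes c_00, c_01, c_10, c_11
freqs = [1.0, 2.0, 3.0, 4.0]    # one carrier frequency (Hz) per basis state
N, T = 4096, 1.0                # number of samples, signal duration (s)

def signal(t):
    """The single classical signal carrying the whole quantum state."""
    return sum(c * cmath.exp(2j * math.pi * f * t) for c, f in zip(amps, freqs))

samples = [signal(n * T / N) for n in range(N)]

# Demodulate: project the sampled signal onto each carrier (a discrete
# inner product); orthogonality of the integer-frequency tones over T
# isolates each amplitude.
recovered = [
    sum(s * cmath.exp(-2j * math.pi * f * (n * T / N))
        for n, s in enumerate(samples)) / N
    for f in freqs
]

for c, r in zip(amps, recovered):
    assert abs(c - r) < 1e-9   # each amplitude comes back intact
```

The key point the sketch makes is that superposition here is just coherent addition of waves, which is why a fully classical signal can carry it.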
Even though this classical system emulates quantum phenomena and behaves like a quantum computer, the scientists emphasize that it is still considered to be classical and not quantum. "This is an important point," La Cour explained. "Superposition is a property of waves adding coherently, a phenomenon that is exhibited by many classical systems, including ours."
"Entanglement is a more subtle issue," he continued, describing entanglement as a "purely mathematical property of waves."
"Since our classical signals are described by the same mathematics as a true quantum system, they can exhibit these same properties."
He added that this kind of entanglement does not violate Bell's inequality, violations of which are a widely used test for entanglement.
"Entanglement as a statistical phenomenon, as exhibited by such things as violations of Bell's inequality, is rather a different beast," La Cour explained. "We believe that, by adding an emulation of quantum noise to the signal, our device would be capable of exhibiting this type of entanglement as well, as described in another recent publication."
In the current paper, La Cour and Ott describe how their system can be constructed using basic analog electronic components, and that the biggest challenge is to fit a large number of these components on a single integrated circuit in order to represent as many qubits as possible. Considering that today's best semiconductor technology can fit more than a billion transistors on an integrated circuit, the scientists estimate that this transistor density corresponds to about 30 qubits. An increase in transistor density of a factor of 1000, which according to Moore's law may be achieved in the next 20 to 30 years, would correspond to 40 qubits.
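The transistor-count estimates above follow from exponential resource scaling: emulating n qubits takes on the order of 2^n analog components, so the qubit count grows only with the logarithm of the component budget. A quick check, under the simplifying assumption of one component per basis-state amplitude:

```python
import math

# A billion components supports about log2(1e9) ~ 30 qubits; a further
# 1000x density gain adds only log2(1000) ~ 10 more.
print(round(math.log2(1e9)))    # ≈ 30 qubits today
print(round(math.log2(1e12)))   # ≈ 40 qubits after a 1000x density increase
```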
This 40-qubit limit is also enforced by a second, more fundamental restriction, which arises from the bandwidth of the signal. The scientists estimate that a signal duration of a reasonable 10 seconds can accommodate 40 qubits; increasing the duration to 10 hours would only increase this to 50 qubits, and a one-year duration would only accommodate 60 qubits. Due to this scaling behavior, the physicists even calculated that a signal duration of the approximate age of the universe (13.77 billion years) could accommodate about 95 qubits, while that of the Planck time scale (10⁻⁴³ seconds) would correspond to 176 qubits.
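The quoted durations are consistent with a simple model in which n qubits require 2^n frequency components, so a channel of bandwidth B and duration T supports roughly log2(B·T) qubits. The bandwidth value below is an assumption, chosen only to reproduce the 10-second/40-qubit data point:

```python
import math

B = 1.1e11   # assumed bandwidth (Hz) that makes 10 s correspond to ~40 qubits

def max_qubits(duration_s):
    """Qubits supportable by a signal of the given duration: log2(B * T)."""
    return math.floor(math.log2(B * duration_s))

print(max_qubits(10))                             # 40 (10 seconds)
print(max_qubits(10 * 3600))                      # 51 (10 hours)
print(max_qubits(365.25 * 24 * 3600))             # 61 (one year)
print(max_qubits(13.77e9 * 365.25 * 24 * 3600))   # 95 (age of the universe)
```

Each extra qubit doubles the required duration, which is why even cosmological timescales buy only a few dozen more qubits.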
Considering that thousands of qubits are needed for some complex tasks, such as certain encryption techniques, this scheme clearly faces some insurmountable limits. Nevertheless, the scientists note that 40 qubits is still sufficient for some low-qubit applications, such as quantum simulations. Because the quantum emulation device offers practical advantages over quantum computers and performance advantages over most classical computers, it could one day prove very useful. For now, the next step will be building the device.
"Efforts are currently underway to build a two-qubit prototype device capable of demonstrating ," La Cour said. "The enclosed photo [see above] shows the current quantum emulation device as a lovely assortment of breadboarded electronics put together by one of my students, Mr. Michael Starkey. We are hoping to get future funding to support the development of an actual chip. Leveraging quantum parallelism, we believe that a coprocessor with as few as 10 could rival the performance of a modern Intel Core at certain computational tasks. Fault tolerance is another important issue that we studying. Due to the similarities in mathematical structure, we believe the same quantum error correction algorithms used to make quantum computers fault tolerant could be used for our quantum emulation device as well."

Saturday, 18 June 2011

The Internet of Things: Toolbox to Help Objects Communicate Via the Net.

Source: ScienceDaily

ScienceDaily (June 17, 2011) — Increasingly, the things people use on a daily basis can be connected to the Internet. An alarm clock not only rings, but can also switch on the coffee machine while turning on the light. But what is needed to ensure that the Internet of Things operates as efficiently as possible? Thus far, the Internet has been an arena reserved for people. But now more and more physical objects are being connected to the Internet: we read emails on our mobile telephones, we have electricity meters that report readings automatically, and pulse monitors and running shoes that publish information about our daily jog directly on Facebook.
Tools for collaboration
The Internet of Things will introduce new smart objects to our homes. One challenge is to find effective solutions to enable different products to work together. Currently no standardised tools or distribution platforms exist in this area.
A group of Norwegian researchers has been addressing this issue. In the research project Infrastructure for Integrated Services (ISIS), they have created a platform for developing and distributing applications for the Internet of Things. The platform encompasses a programming tool for developers, called Arctis, and the website ISIS Store for downloading applications. The project has received funding from the Research Council of Norway's Large-scale Programme VERDIKT.
Simple programming
Arctis was developed by researchers at the Norwegian University of Science and Technology (NTNU). One of them is postdoctoral researcher Frank Alexander Kraemer.
"In a 'smart' everyday life objects and applications often need to be connected to several different communication services, sensors and other components. At the same time they need to respond quickly to changes and the actions of users. This requires very good control over concurrence in the system, which can be difficult to achieve with normal programming," he explains.
Dr Kraemer believes that the tool will make it easier to create new applications, adapt them to existing applications and update software as necessary.
"Developing a simple application with Arctis can be as easy as fitting together two building blocks, but more advanced applications can also be created, depending on what you are looking for," Dr Kraemer continues.
Talking to each other
"It is the collaborative system ICE Composition Engine (ICE) that will govern the whole thing and allow the objects to talk to each other," explains Reidar Martin Svendsen, project manager at the Norwegian telecommunications company the Telenor Group.
ICE can both manage the communication between objects in your home and keep track of any updates. The system is installed on a modem, a decoder or an adapter in the home and provides the user with a local gateway which ensures that the Internet of Things will continue to work even when the user is offline.
Key developers
Telenor is seeking to become an operator for the Internet of Things by acting as a link between developers and end-users. But if the company is to succeed, a sufficient number of developers will need to choose to use its tools.
"We have established our own App Store where talented developers can publish the new applications they create and end-users can buy and download the applications they need. Basically, you can choose software according to your own needs and preferences," says Mr Svendsen.
The downloaded applications can be combined as needed using a software programme called Puzzle, which serves as the user interface to the ICE system.
Safe connections
For the project to flourish, people have to be willing to pay for the applications. There are already many similar applications available online free-of-charge through the data infrastructure platform Pachube, for example. Why are users going to pay for something they can download legally and at no cost?
"It is better if a well-known operator is responsible for critical systems such as house alarms. For these types of systems you should go via the App Store to a supplier you trust. You don't know anything about the intentions of those who put out programmes free-of-charge on the Internet. But if your system needs updating or you require a service, it is an advantage to be using a reputable, recognised operator," explains Mr Svendsen.
"On the whole it will be up to the developers to decide what to charge for. At the ISIS Store there are currently a number of applications available that can be downloaded free-of-charge," he continues.

Story Source:
The above story is reprinted (with editorial adaptations by ScienceDaily staff) from materials provided by
The Research Council of Norway. The original article was written by Geir Aas/Else Lie; translation by Anna Godson/Carol B. Eckmann.

Tuesday, 12 January 2010

Faster and More Efficient Software for the US Air Force.

Source: ScienceDaily
------------------------
ScienceDaily (Jan. 12, 2010) — Researchers at the University of Nebraska in Lincoln have addressed the issue of faulty software by developing an algorithm and open-source tool that generates tests 300 times faster than existing approaches, reducing overall software testing time.
The new algorithm has potential to increase the efficiency of the software testing process across systems.
The project, funded in part by an Air Force Office of Scientific Research (AFOSR) Young Investigator Award and through a National Science Foundation Early CAREER Award, is of particular interest to the military because of the potential to reduce errors in theater. This technology will also be helpful to the private sector, where some agencies report financial losses of up to 50 billion dollars per year because of poor software.
"Software failures have the potential to cause financial, environmental or bodily harm," said lead researcher, Dr. Myra Cohen. "Our techniques will help to improve the quality of software in the military to help ensure that those systems behave properly in the field."
"The ultimate goal of research like this is not just to reduce software testing costs, but to do so while maintaining or even increasing confidence in the tests themselves," said AFOSR Program Manager, Dr. David Luginbuhl who is overseeing Cohen's work.
"Although algorithms exist that can produce samples for testing, few can handle dependencies between features well. Either they run slowly or they select very large test schedules, which means that testing takes too long," said Cohen.
Her project, called "Just Enough Testing", aims to reuse test results across different systems that share similar sets of features, so that the time needed to test a single system is reduced.
Large and complex families of software systems are common, and within them, groups of interacting features may cause faults to occur. The scientists have examined ways to ensure that faults are found earlier and more often in these types of systems.
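Finding faults caused by interacting features is the province of combinatorial interaction testing, the general area Cohen's group works in. The greedy pairwise sketch below uses made-up feature names and is a generic illustration of the idea, not the project's actual algorithm: instead of testing every combination, pick configurations until every pair of feature settings has appeared in at least one test.

```python
from itertools import combinations, product

# Hypothetical product line: 4 features, 2 settings each -> 16 full configs.
features = {"os": ["linux", "windows"], "db": ["mysql", "sqlite"],
            "ui": ["web", "cli"], "auth": ["ldap", "local"]}

names = list(features)
all_configs = [dict(zip(names, vals)) for vals in product(*features.values())]

def pairs(cfg):
    """All (feature, setting) pairs exercised by one configuration."""
    return {((f1, cfg[f1]), (f2, cfg[f2])) for f1, f2 in combinations(names, 2)}

# Greedily add the config covering the most not-yet-tested pairs until
# every pair of settings appears in some test.
uncovered = set().union(*(pairs(c) for c in all_configs))
suite = []
while uncovered:
    best = max(all_configs, key=lambda c: len(pairs(c) & uncovered))
    suite.append(best)
    uncovered -= pairs(best)

print(len(suite), "tests instead of", len(all_configs))
```

The suite covers all pairwise interactions with a handful of tests rather than the exhaustive sixteen, and the gap widens rapidly as features are added.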
"In the long term, we expect that as software product lines are used to produce large numbers of systems, and as they mature over time, we will be able to deploy new systems faster and with less likelihood of failure," she said.
Story Source:
Adapted from materials provided by
Air Force Office of Scientific Research.

'Wet' Computing Systems to Boost Processing Power.

Source: ScienceDaily
---------------------------
ScienceDaily (Jan. 12, 2010) — A new kind of information processing technology inspired by chemical processes in living systems is being developed by researchers at the University of Southampton.
Dr Maurits de Planque and Dr Klaus-Peter Zauner at the University's School of Electronics and Computer Science (ECS) are working on a project which has just received €1.8 million from the European Union's Future and Emerging Technologies (FET) Proactive Initiatives, which recognise ground-breaking work that has already demonstrated important potential.
The researchers, Dr de Planque, a biochemist, and Dr Zauner, a computer scientist, will adapt brain processes to a 'wet' information-processing scenario by setting up chemicals in a tube that behave like the transistors in a computer chip.
"What we are developing here is a very crude, minimal liquid brain and the final computer will be 'wet' just like our brain," said Dr Zauner. "People realise now that the best information processes we have are in our heads and as we are increasingly finding that silicon has its limitations in terms of information processing, we need to explore other approaches, which is exactly what we are doing here."
The project, entitled Artificial Wet Neuronal Networks from Compartmentalised Excitable Chemical Material, is co-ordinated by Friedrich Schiller University Jena together with project partners the University of the West of England, Bristol, and the Institute of Physical Chemistry, Polish Academy of Sciences, Warsaw. It will run for three years and involves three complementary objectives.
The first is to engineer lipid-coated water droplets, inspired by biological cells, containing an excitable chemical medium and then to connect the droplets into networks in which they can communicate through chemical signals. The second objective is to design information-processing architectures based on the droplets and to demonstrate purposeful information processing in droplet architectures. The third objective is to establish and explore the potential and limitations of droplet architectures.
"Our system will copy some key features of neuronal pathways in the brain and will be capable of excitation, self-repair and self-assembly," said Dr de Planque.
Story Source:
Adapted from materials provided by
University of Southampton, via AlphaGalileo.

New multi-touch screen technology developed (with Video)

Source: Physorg.com

Scientists from New York University have formed a company to bring flexible multi-touch screens using a new technology to a range of devices, from e-readers to musical instruments. The new touch screens respond to all kinds of objects, as well as fingers and hands.

The team, led by Ken Perlin and Ilya Rosenberg from the Media Research Laboratory, formed their company Touchco to develop IFSR (interpolating force-sensitive resistance) technology, which uses resistors sensitive to the force or pressure applied at touch points. This, combined with positional interpolation, allows for (theoretically) unlimited simultaneous touch inputs, in contrast to other touch technologies, such as capacitive screens, which can track only a limited number of touch points. It has low power requirements and is inexpensive: Touchco expects to sell the screen material at $10 per square foot. The technology is also easily scalable for use in small or large devices.

Perlin said that the IFSR technology is likely to appear in a new range of e-readers later this year, and will also be used in laptops or notebooks, and new types of musical instruments. In computer applications, the touch pads allow the user to navigate with a light touch, but to select and manipulate objects when more pressure is applied.
The company has also been collaborating with Disney to develop a digital sketchbook that will use sensitive pressure sensors capable of differentiating between the touch of a hand, a pencil, brush or eraser. Drawing pictures is just one of many possible applications of the technology.

The touch pads consist of layers of FSR ink sandwiched between opaque or transparent sheets of plastic onto which conductive wires are printed. The total thickness is only 0.25 mm. When pressure is applied to an FSR sensor a current flows from the wires in one layer to the wires in the other layer, and the pressure applied is determined by the amount of electric current flowing from layer to layer. Arranging sensors in a grid would have been expensive and impractical, so Touchco developed scanners to measure the touches and determine their positions. Resolution is 254 µm (100 dpi), but this can be increased by decreasing the wire spacing.
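The interpolation step can be sketched with a toy calculation. The grid readings and the force-weighted centroid below are illustrative assumptions, not Touchco's actual scanner algorithm: the point is that with force measured only at wire crossings, weighting positions by force locates a touch much more finely than the wire spacing itself.

```python
# Hypothetical force readings at four adjacent wire crossings,
# (row, col) -> measured force; a real scanner would sweep the whole grid.
readings = {
    (1, 1): 0.2, (1, 2): 0.6,
    (2, 1): 0.4, (2, 2): 1.2,
}

# Force-weighted centroid: interpolates the touch position between wires.
total = sum(readings.values())
x = sum(c * f for (r, c), f in readings.items()) / total
y = sum(r * f for (r, c), f in readings.items()) / total

print(round(x, 3), round(y, 3))   # prints "1.75 1.667", between the wires
```

Because the estimate lands between grid lines, effective resolution improves with signal quality as well as with wire spacing, which matches the article's note that resolution can be raised by packing the wires more densely.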
Touchco has already begun selling developer kits to device manufacturers and expects to see IFSR technology finding many applications during 2010.

More information: Touchco: http://touchco.com/

Stanford University: We're emulating the brain... in silicon!

Source: Stanford University/Bioengineering Department

Welcome to Brains in Silicon. Learn about the lab, get to know the brains that work here, and find out about new projects that you could join. We have crafted two complementary objectives: to use existing knowledge of brain function in designing an affordable supercomputer, one that can itself serve as a tool to investigate brain function, feeding back and contributing to a fundamental, biological understanding of how the brain works. We model brains using an approach far more efficient than software simulation: we emulate the flow of ions directly with the flow of electrons. Don't worry, on the outside it looks just like software. Welcome and enjoy your time here!