Monday, 23 March 2009

Software Fits Flexible Components


ScienceDaily (Mar. 23, 2009) — Can the newly designed dashboard be easily installed? What paths should the assembly robot take so that the cables do not hit against the car body? A new software program simulates assembly paths and also factors in the pliability of components.
Car component designers not only have to ensure that their designs are visually appealing, they also have to think about the assembly process: Can the designed dashboard be easily installed in the new car model? What assembly paths need to be taken so that the component does not hit and scratch the car body? Thanks to a new software program, components that only exist in the form of CAD data can be virtually installed in the new car model by the assembly planners. If a component is too large to be maneuvered into place, the program gives concrete advice on where to change its shape.
The software was developed and has now been further improved by researchers at the Fraunhofer-Chalmers Research Centre for Industrial Mathematics FCC in Gothenburg, Sweden, and the Fraunhofer Institute for Industrial Mathematics ITWM in Kaiserslautern. “We can also include the pliability of components in the assembly simulation,” says ITWM group manager Dr.-Ing. Joachim Linn. “In the CAD data, flexible components such as plastic parts for the passenger compartment appear rigid, but during assembly they have to be slightly bent and pressed.”
How much force needs to be applied to bend the dashboard far enough to install it in the car? Can the job be done by just one employee and are special tools required? How can flexible brake hoses be installed most efficiently? The researchers also simulate the use of assembly robots, whose flexible supply lines often scrape against the car body, leaving small scratches. The program computes how the robot should move and fit the parts so that the cables do not hit the bodywork.
These computations are fast – like the CAD programs the designers are used to. “You can work interactively with the program, for example to make a component longer or shorter in just a few seconds. For this purpose we slimmed down the highly accurate structure-mechanical computation processes. The results are still accurate enough but are delivered in real time,” says Linn. Assembly paths, too, are computed within minutes. The researchers will give a live demonstration of the program at the Hannover-Messe (Hall 17, Stand D60) from April 20 to 24. The software is due to be launched on the market before the end of the year; support services and training material are already available.
Adapted from materials provided by Fraunhofer-Gesellschaft, via AlphaGalileo.

Tuesday, 17 March 2009

Fuzzy Logic And Grey Science

ScienceDaily (Mar. 17, 2009) — If something is true it cannot be false, and if something is false it cannot be true. The same can be said of black and white. This principle of classical logic is one of the mainstays of the scientific method, but it loses effectiveness in the grey areas between these absolutes.
In other words, classical logic cannot adequately quantify elements that are black but also white, and which can only be differentiated by applying arbitrary distinctions of shades such as light grey, dark grey, or very dark grey. How many grains of sand can a desert lose before we cease to consider it a desert? How many degrees does the temperature in a room have to drop for it to be cold?
Fuzzy logic provides a channel for dealing scientifically with these qualitative concepts. “It is an extension of classical logic used to quantify vagueness”, explains Eduard Alarcón, a lecturer in the Department of Electronic Engineering at the UPC.
The concept of fuzzy logic derives from an article called “Fuzzy sets” published in 1965 by the engineer Lotfi A. Zadeh. Eduard Alarcón explains that, “in classical logic, according to Bertrand Russell, from a set of antecedents is derived a set of consequents”. The system is based on a combination of rules taking the form “if” (antecedent), followed by “then” (consequent). “In fuzzy logic, both the antecedents and the consequents are fuzzy sets, which aim to quantify the vagueness of the qualifiers”, says Alarcón.
Joan Domingo, of the Department of Automatic Control, explains how fuzzy sets are used in a lighting control system. The system parameters, he explains, are a set of between three and six adjectives that describe the light intensity as, for example, “very low, low, sufficient, high and very high”. A series of light readings are taken, and a value is assigned to indicate the degree to which each reading corresponds to each of the adjectives. Thus, a very weak light could be assigned a value of 0.6 for “very low”, 0.3 could be “low”, and 0.1 “sufficient”. Each measurement is split between different sets, and the degree to which it corresponds to each one is represented by a function.
The system then applies a series of rules of the type “if ... then ...”, which are defined by an expert or directly integrated into the system. Thus, a rule might be, “if the light is very low, then we need to apply very high lighting”. These rules are applied to the input data using a chip- or algorithm-based inference motor. The output of each inference rule is an area, and the area of intersection of all the outputs is the final result, which is then translated into an action over the physical environment to which the system is applied.
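As an illustration only (this is not the UPC system itself), a rule-based fuzzy controller of this kind can be sketched in a few lines of Python. The membership functions, the rule set and the weighted-average defuzzification step below are simplifying assumptions; the article describes an area-based combination of rule outputs, whereas the sketch uses a Sugeno-style weighted average to keep the code short.

```python
# Minimal fuzzy lighting controller (illustrative sketch, not the UPC system).
# Membership functions, rules and the defuzzification method are assumptions.

def triangular(x, a, b, c):
    """Degree of membership in a triangular fuzzy set peaking at b."""
    if x <= a or x >= c:
        return 0.0
    return (x - a) / (b - a) if x <= b else (c - x) / (c - b)

# Fuzzy sets over the measured light level (on an assumed 0-100 scale).
light_sets = {
    "very_low":   lambda x: triangular(x, -1, 0, 25),
    "low":        lambda x: triangular(x, 0, 25, 50),
    "sufficient": lambda x: triangular(x, 25, 50, 75),
    "high":       lambda x: triangular(x, 50, 75, 100),
    "very_high":  lambda x: triangular(x, 75, 100, 101),
}

# Rules of the form "if light is X then lamp power is Y" (Y as a crisp level,
# a Sugeno-style simplification of the area-based method described above).
rules = {"very_low": 100, "low": 75, "sufficient": 50, "high": 25, "very_high": 0}

def lamp_power(light_reading):
    """Weighted average of rule outputs, weighted by how strongly each rule fires."""
    weights = {name: f(light_reading) for name, f in light_sets.items()}
    total = sum(weights.values())
    if total == 0:
        return 0.0
    return sum(weights[name] * rules[name] for name in rules) / total

print(lamp_power(10))   # very dim room  -> high lamp power
print(lamp_power(60))   # moderately lit -> moderate power
```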
These expert systems are based on rules that apply fuzzy logic, and have been integrated into electrical appliances, cameras, air-conditioning systems, industrial control systems and information technology in the last few decades.
More efficient appliances
Fuzzy systems also have a range of domestic applications, such as a washing machine that uses less detergent and water for lighter loads, or a control system for maintaining a comfortable temperature without switching on or off each time the thermostat registers a certain value, and without creating sudden rises or drops in temperature. This smooth transition between temperatures is achieved thanks to the degrees and zones used by fuzzy sets. Eduard Alarcón explains that fuzzy sets are analytical mathematical sets that provide a graded description referring to a specific zone. Each zone is independent of the others, and is therefore responsible for a single control action. By processing input values as degrees of zones, fuzzy systems can interpolate the results of different zones, thus ensuring that the resulting action is less abrupt.
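The smooth behaviour can be seen directly in a toy thermostat: sweeping the temperature through the boundary between a "cold" and a "warm" zone makes the heater output change gradually instead of snapping between on and off. The zones, setpoints and weighted-average rule below are again assumptions, in the same spirit as the lighting sketch above.

```python
# Smooth heater control: the output interpolates between zones instead of snapping.
def mu_cold(t):
    return max(0.0, min(1.0, (22 - t) / 4))   # fully "cold" below 18 degrees C

def mu_warm(t):
    return max(0.0, min(1.0, (t - 18) / 4))   # fully "warm" above 22 degrees C

def heater_power(t):
    # Two rules: "if cold then heater 100 %", "if warm then heater 0 %".
    w_cold, w_warm = mu_cold(t), mu_warm(t)
    return 100 * w_cold / (w_cold + w_warm)

for t in (17, 19, 20, 21, 23):
    print(t, round(heater_power(t), 1))
# 17 -> 100.0, 19 -> 75.0, 20 -> 50.0, 21 -> 25.0, 23 -> 0.0  (no sudden jump)
```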
Joan Domingo emphasizes the decision-making speed of inference motors, whose throughput, unlike that of conventional processors, is measured in millions of fuzzy logic inferences per second (MFLIPS). “Classical control systems are analogue and slow, whereas fuzzy systems are digital and run at incredibly high speeds. Fuzzy systems are very easy to use, relatively quick to install, and produce excellent results”, he explains.
In addition to the development of new systems and devices, theoretical advances are also being made. For example, the Research Group on Functional Mathematical Modeling and Applications is currently studying improved fuzzy classification methods for solving problems caused by linguistic ambiguity in the definition of adjectives and imprecise measurement devices.
Control systems are the key product of fuzzy logic in the field of automation and control, but UPC researchers are also incorporating fuzzy logic into the design of systems that will improve the quality of life of users in other areas, such as aeronautics and medical image analysis.
Detecting cancer cells
Can cancer cells be detected in a uterine tissue sample? Until recently, to answer this question a pathologist would have had to spend hours over the microscope examining a cytology sample of uterine tissue provided by a gynecologist. Samples of this size contain millions of cells, and prolonged examination can lead to tiredness and increased probability of error.
To minimize this risk, a joint research team from the UPC and Rovira i Virgili University, directed by Pilar Sobrevilla, of the Department of Applied Mathematics II, and Eduard Montseny, of the Department of Automatic Control, is working with the Hospital de Sant Pau to develop an automatic cytology image analysis system. Pilar Sobrevilla explains that “the system, which is based on fuzzy logic, isolates all of the cells in the image and determines their degree of normality”.
The system examines the image and determines the possible presence of abnormal cells on the basis of color and texture. The result is then displayed in a new image that highlights the areas containing cancerous cells. The process can be performed in real time, as the pathologist simply has to feed the original image into the device to obtain a modified version highlighting the areas that need to be examined more thoroughly.
The system is already used by the Hospital de Sant Pau, and researchers are working on a second phase of the project that, explains Sobrevilla, will enable users to pinpoint potentially affected areas with greater precision and obtain data that can be tailored to the specific information required by the doctor.
The same research group has also designed a system which uses a fuzzy logic algorithm to grade the quality of corneal tissue used in transplants, the first version of which is already used by the Tissue Bank of the Hospital de Sant Pau in Barcelona. The system analyzes images of corneal tissue to determine whether it is of suitable quality for transplantation, and factors in data on the general health of potential donors to determine whether they meet the relevant health requirements.
Fuzzy logic can also be used to model the noise pollution produced by planes at take-off and landing. Xavier Prats, of the Department of Mechanical Engineering, takes the idea a step further in his doctoral thesis, which describes a system for defining the optimal trajectory during these maneuvers to minimize the noise pollution affecting local residents. Joseba Quevedo, the joint director of the thesis with Vincenç Puig, explains that new satellite navigation systems have made it possible for planes to follow curved trajectories during take-off and landing, rather than the straight ascents and descents required previously.
By altering the trajectory, it is also possible to reduce the noise pollution suffered by those living close to airports, the severity of which depends on the tolerance of individuals, and is not an objective parameter such as the level of sound produced. As Quevedo explains, “A patient in a hospital ward does not hear things in the same way as a young person shopping in a market, and the perceived level of a particular sound can vary enormously between day-time and night-time due to changes in the acoustic level of the surroundings”. The system uses fuzzy logic to analyze the level of noise pollution and grade the annoyance caused to people in local facilities (such as hospitals, schools and markets) and residential areas located close to the airport. The analysis also takes into account the time of day and the distance at which the plane passes the area.
Fuzzy logic and soft computing
In 1991, Lotfi A. Zadeh, a professor at the University of California, Berkeley, and the founder of fuzzy logic, coined the term ‘soft computing’. This branch of artificial intelligence deals with the design of expert systems capable of managing inexact, uncertain and/or incomplete information. Fuzzy logic is one of the principal techniques in this field, together with evolutionary algorithms and neural networks. More than 2000 experts in soft computing from around the world will meet in Barcelona in July 2010 for the IEEE World Congress on Computational Intelligence, organized by the UPC researcher Pilar Sobrevilla.
Adapted from materials provided by Universitat Politècnica de Catalunya.

Sending Out Internet Warnings For Outages, Viruses


ScienceDaily (Mar. 17, 2009) — A long-overdue internet early warning system for Europe could help the region avoid deliberate or inadvertent outages, reduce the spread of new computer viruses, and ensure continuity of services.
Malte Hesse and Norbert Pohlmann of the Institute for Internet Security at the University of Applied Sciences Gelsenkirchen, Germany, point out that there is a growing need to improve the stability and trustworthiness of the internet, whether for web access, email, instant messaging or file transfer.
They add that raising awareness of critical processes and components on the internet among those responsible for their operation is essential. Moreover, there is a need to learn about internet use so that needs and service demands can be best catered for.
The internet is an incredibly diffuse network with no single, centralised control hub. Its complexity is not bounded by geographical, political, administrative or cultural borders, which means it presents an amazing challenge to the global society hoping to make best use of it and avoid criminal and terrorist activity that might occur online.
The internet's strength lies in this decentralised structure, but that also represents a problem: it is not governed centrally and consists of almost 30,000 autonomous systems, each managed by an individual organisation, mostly within the private sector. The researchers obtained this figure using their AiconViewer tool developed by colleague Stefan Dierichs in 2006. Unfortunately, private organisations are exposed to a high level of competition, especially in times of recession, and this precludes the open exchange of important management information.
Nevertheless, if a European early warning system is to be built there is a need for a shift in attitude. "The cooperation of companies, organisations and governments is important to create a global view of the internet. By that we will be able to detect attacks in time and answer interesting research questions about the internet," the researchers say.
Early warning systems already exist in other domains and are a crucial component of effective risk management in enterprises and in national homeland security. In order to create a European early warning system, funding has to be provided mainly by public sources, in combination with income generated through added value for the private partners, the researchers conclude.
Journal reference:
Malte Hesse and Norbert Pohlmann. European internet early warning system. International Journal of Electronic Security and Digital Forensics, 2009, 2, 1-17
Adapted from materials provided by Inderscience, via AlphaGalileo.

Monday, 16 March 2009

Breakthrough For Post-4G Communications


ScienceDaily (Mar. 14, 2009) — With much of the mobile world yet to migrate to 3G mobile communications, let alone 4G, European researchers are already working on a new technology able to deliver data wirelessly up to 12.5Gb/s.
The technology – known as ‘millimetre (mm)-wave’ or microwave photonics – has commercial applications not just in telecommunications (access and in-house networks) but also in instrumentation, radar, security, radio astronomy and other fields.
Despite the quantum leap in performance made possible by combining the latest radio and optics technologies to produce mm-wave components, it will probably be only a few years before the average EU citizen sees real benefits.
This is thanks to research and development work being done by the EU-funded project IPHOBAC, which brings together partners from both academia and industry with the aim of developing a new class of components and systems for mm-wave applications.
The mm-wave band is the extremely high frequency part of the radio spectrum, from 30 to 300 gigahertz (GHz), and it gets its name from having a wavelength of one to 10mm. Until now, the band has been largely undeveloped, so the new technology makes available for exploitation more of the scarce and much-in-demand spectrum.
New products from Europe
IPHOBAC is not simply a ‘paper project’ where the technology is researched, but very much a practical exercise to develop and commercialise a new class of products with a ‘made in Europe’ label on them.
While several companies in Japan and the USA have been working on merging optical and radio frequency technologies, IPHOBAC is the world’s first fully integrated effort in the field, bringing together a large number of companies. As a result, the three-year project, which runs until the end of 2009, already has an impressive list of achievements to its name.
It recently unveiled a tiny component, a transmitter able to transmit a continuous signal not only through the entire mm-wave band but beyond. Its full range is 30 to 325GHz and even higher frequency operation is now under investigation. The first component worldwide able to deliver that range of performance, it will be used in both communications and radar systems. Other components developed by the project include 110GHz modulators, 110GHz photodetectors, 300GHz dual-mode lasers, 60GHz mode-locked lasers, and 60GHz transceivers.
Truly disruptive technology
Project coordinator Andreas Stöhr says millimetre-wave photonics is a truly disruptive technology for high frequency applications. “It offers unique capabilities such as ultra-wide tunability and low-phase noise which are not possible with competing technologies, such as electronics,” he says.
What this will mean in practical terms is not only ultra-fast wireless data transfer over telecommunications networks, but also a whole range of new applications (http://www.iphobac-survey.org).
One of these, a 60GHz Photonic Wireless System, was demonstrated at the ICT 2008 exhibition in Lyon and was voted into the Top Ten Best exhibits. The system allows wireless connectivity in full high definition (HD) between devices in the home, such as a set-top box, TV, PC, and mobile devices. It is the first home area network to demonstrate the speeds necessary for full wireless HD of up to 3Gb/s.
The system can also be used to provide multi-camera coverage of live events in HD. “There is no time to compress the signal as the director needs to see live feed from every camera to decide which picture to use, and ours is the only technology which can deliver fast enough data rates to transmit uncompressed HD video/audio signals,” says Stöhr.
The same technology has been demonstrated for access telecom networks and has delivered world record data rates of up to 12.5Gb/s over short- to medium-range wireless spans, or 1500 times the speed of upcoming 4G mobile networks.
One way in which the technology can be deployed in the relatively short term, according to Stöhr, is wirelessly supporting very fast broadband to remote areas. “You can have your fibre in the ground delivering 10Gb/s but we can deliver this by air to remote areas where there is no fibre or to bridge gaps in fibre networks,” he says.
Systems for outer space
The project is also developing systems for space applications, working with the European Space Agency. Stöhr said he could not reveal details as this has not yet been made public, save to say the systems will operate in the 100GHz band and are needed immediately.
There are various ongoing co-operation projects with industry to commercialise the components and systems, and some components are already at a pre-commercial stage and are being sold in limited numbers. There are also ongoing talks with some of the biggest names in telecommunications, including Siemens, Ericsson, Thales Communications and Malaysia Telecom.
“In just a few years’ time everybody will be able to see the results of the IPHOBAC project in telecommunications, in the home, in radio astronomy and in space. It is a completely new technology which will be used in many applications, even medical ones, where mm-wave devices to detect skin cancer are under investigation,” says Stöhr.
Adapted from materials provided by ICT Results.

'Map Of Science' Shows Scientists' Virtual Trails Through Online Services

ScienceDaily (Mar. 16, 2009) — Los Alamos National Laboratory scientists have produced the world's first Map of Science—a high-resolution graphic depiction of the virtual trails scientists leave behind when they retrieve information from online services.

The research, led by Johan Bollen, appeared recently in PLoS One. “This research will be a crucial component of future efforts to study and predict scientific innovation, as well as novel methods to determine the true impact of articles and journals,” Bollen said.
While science is of tremendous societal importance, it is difficult to probe the often hidden world of scientific creativity. Most studies of scientific activity rely on citation data, which takes a while to become available because both the cited publication and the publication of a particular citation can take years to appear. In other words, citation data observes science as it existed years in the past, not the present.
Bollen and colleagues from LANL and the Santa Fe Institute collected usage-log data gathered from a variety of publishers, aggregators, and universities spanning a period from 2006 to 2008. Their collection totaled nearly 1 billion online information requests. Because scientists typically read articles online well before they can be cited in subsequent publications, usage data reveal scientific activity nearly in real-time. Moreover, because log data reflect the interactions of all users—such as authors, science practitioners, and the informed public—they do not merely reflect the activities of scholarly authors.
Whenever a scientist accesses a paper online from a publisher, aggregator, university, or similar publishing service, the action is recorded by the servers of these Web portals. The resulting usage data contains a detailed record of the sequences of articles that scientists download as they explore their present interests. After counting the number of times that scientists, across hundreds of millions of requests, download one article after another, the research team calculated the probability that an article or journal accessed by a scientist would be followed by a subsequent article or journal as part of the scientists’ online behavior. Based on such behavior, the researchers created a map that graphically portrays a network of connected articles and journals.
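In outline, the counting step amounts to building a transition matrix from the logs: for every pair of consecutive requests in a user's session, increment a counter, then normalise by how often the first item appears. The sketch below illustrates that idea only; the toy sessions and journal names are invented, and the real MESUR pipeline is considerably more elaborate.

```python
# Build article-to-article transition probabilities from per-user download sequences.
# Toy data; a real usage log would carry hundreds of millions of requests.
from collections import Counter, defaultdict

sessions = [
    ["ecology_j1", "architecture_j3", "ecology_j2"],
    ["ecology_j1", "ecology_j2"],
    ["architecture_j3", "ecology_j1"],
]

pair_counts = Counter()
out_counts = defaultdict(int)
for seq in sessions:
    for a, b in zip(seq, seq[1:]):          # consecutive requests within one session
        pair_counts[(a, b)] += 1
        out_counts[a] += 1

# P(next = b | current = a): these conditional probabilities become the
# edge weights of the usage-based "map of science".
transition = {(a, b): n / out_counts[a] for (a, b), n in pair_counts.items()}
for (a, b), p in sorted(transition.items()):
    print(f"{a} -> {b}: {p:.2f}")
```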
Bollen and colleagues were surprised by the map’s scope and detail. Whereas maps based on citations favor the natural sciences, the team’s maps of science showed a prominent and central position for the humanities and social sciences, which, in many places, acted like interdisciplinary bridges connecting various other scientific domains. Sections of the maps were shaped by the activities of practitioners who read the scientific literature but do not frequently publish in its journals.
The maps furthermore revealed unexpected relations between scientific domains that point to emerging relationships that are capturing the collective interest of the scientific community—for instance a connection between ecology and architecture.
“We were surprised by the fine-grained structure of scientific activity that emerges from our maps,” said Bollen.
According to Bollen, future work will focus on issues involved in the sustainable management of large-scale usage data, as well as the production of models that explain the online behavior of scientists and how it relates to the emergence of scientific innovation. This information will help funding agencies, policy makers, and the public to better understand how best to tap the ebb and flow of scientific inquiry and discovery.
The research team includes Bollen, Herbert Van de Sompel, Ryan Chute, and Lyudmila Balakireva of LANL’s Digital Library Research and Prototyping Team; Aric Hagberg, Luis Bettencourt and Marko A. Rodriguez of LANL’s Mathematical Modeling and Analysis Group; and LANL’s Center for Nonlinear Studies. Bettencourt is also part of the Santa Fe Institute.
Bollen and colleagues received funding from the Andrew W. Mellon Foundation to examine the potential of large-scale usage data. The study is part of the MESUR (Metrics from Scholarly Usage of Resources) project, of which Bollen is the principal investigator. The MESUR usage database is now considered the largest of its kind.
Adapted from materials provided by DOE/Los Alamos National Laboratory.


Sunday, 15 March 2009

Fighting Tomorrow's Hackers: Keeping Encryption Safe From Future Quantum Computers

ScienceDaily (Feb. 6, 2009) — One of the themes of Dan Brown’s The Da Vinci Code is the need to keep vital and sensitive information secure. Today, we take it for granted that most of our information is safe because it's encrypted. Every time we use a credit card, transfer money from our checking accounts -- or even chat on a cell phone -- our personal information is protected by a cryptographic system.
But the development of quantum computers threatens to shatter the security of current cryptographic systems used by businesses and banks around the world.
“We need to develop a new encryption system now, before our current systems -- such as RSA -- become instantly obsolete with the advent of the first quantum computer,” says Prof. Oded Regev at Tel Aviv University’s Blavatnik School of Computer Science. To accomplish that, Prof. Regev has proposed the first safe and efficient system believed to be secure against the massive computational power of quantum computers and backed by a mathematical proof of security.
Secure for Centuries
Prof. Regev stresses it is imperative that a new cryptographic system be developed and implemented as soon as possible. One reason is that current information, encrypted with RSA, could be retroactively hacked in the future, once quantum computers are available. That means that bank and other financial information, medical records, and even digital signatures could instantly become visible.
“You don’t want this information to remain secure for just 5 or 10 years until quantum computers are built,” says Prof. Regev. “You want it to be safe for the next century. We need to develop alternatives to RSA now, before it’s too late.”
New Cryptographic System
Cryptographic systems are used to securely transmit information such as banking and online transactions, and typically rely on the assumption that the factoring problem is difficult to solve. As a simplified example, if the number 3088433 were transmitted, an eavesdropper wouldn’t be able to tell that the number is derived from the factors 1583 and 1951. “Quantum computers can ‘magically’ break all of these factoring-based cryptographic systems, something that would take billions of years for current computers to accomplish,” Prof. Regev explains.
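A back-of-the-envelope illustration of the asymmetry: recovering the two factors of the toy number above is trivial for a computer, but the same approach is hopeless against a real RSA modulus of hundreds of digits. The trial-division loop below is purely illustrative and is not Prof. Regev's method.

```python
# Trial-division factoring: fine for a 7-digit toy number, hopeless for an RSA modulus.
def factor(n):
    d = 2
    while d * d <= n:
        if n % d == 0:
            return d, n // d
        d += 1
    return n, 1

print(factor(3088433))    # (1583, 1951), found almost instantly
# A 2048-bit RSA modulus has roughly 617 decimal digits; this loop would need on the
# order of 10**308 steps, which is why factoring-based systems resist classical
# computers -- and why quantum algorithms such as Shor's change the picture.
```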
The current gold standard in encryption is the universally used RSA cryptosystem, which will be instantly broken once quantum computers are a reality -- an event predicted to happen as early as the next decade. To replace RSA in this new reality, Prof. Regev combined ideas from quantum computation with the research of other leaders in the field to create a system that is efficient enough to be practical for real-world applications.
Prof. Regev’s work was first announced in the ACM Symposium on Theory of Computing and will appear in the Journal of the Association for Computing Machinery. His work has now become the foundation for several other cryptographic systems developed by researchers from Stanford Research Institute, Stanford University, and MIT. Its potential real-world applications are extensive, ranging from banking transactions to eBay and other online auctions to digital signatures that can remain secure for centuries.
Adapted from materials provided by Tel Aviv University.

How Small Can Computers Get? Computing In A Molecule


ScienceDaily (Dec. 30, 2008) — Over the last 60 years, ever-smaller generations of transistors have driven exponential growth in computing power. Could molecules, each turned into minuscule computer components, trigger even greater growth in computing over the next 60?
Atomic-scale computing, in which computer processes are carried out in a single molecule or using a surface atomic-scale circuit, holds vast promise for the microelectronics industry. It allows computers to continue to increase in processing power through the development of components at the nano- and picoscale. In theory, atomic-scale computing could put computers more powerful than today’s supercomputers in everyone’s pocket.
“Atomic-scale computing researchers today are in much the same position as transistor inventors were before 1947. No one knows where this will lead,” says Christian Joachim of the French National Scientific Research Centre’s (CNRS) Centre for Material Elaboration & Structural Studies (CEMES) in Toulouse, France.
Joachim, the head of the CEMES Nanoscience and Picotechnology Group (GNS), is currently coordinating a team of researchers from 15 academic and industrial research institutes in Europe whose groundbreaking work on developing a molecular replacement for transistors has brought the vision of atomic-scale computing a step closer to reality. Their efforts, a continuation of work that began in the 1990s, are today being funded by the European Union in the Pico-Inside project.
In a conventional microprocessor – the “motor” of a modern computer – transistors are the essential building blocks of digital circuits, creating logic gates that process true or false signals. A few transistors are needed to create a single logic gate and modern microprocessors contain billions of them, each measuring around 100 nanometres.
Transistors have continued to shrink in size since Intel co-founder Gordon E. Moore famously predicted in 1965 that the number of transistors that can be placed on a processor would double roughly every two years. But there will inevitably come a time when the laws of quantum physics prevent any further shrinkage using conventional methods. That is where atomic-scale computing comes into play with a fundamentally different approach to the problem.
“Nanotechnology is about taking something and shrinking it to its smallest possible scale. It’s a top-down approach,” Joachim says. He and the Pico-Inside team are turning that upside down, starting from the atom, the molecule, and exploring if such a tiny bit of matter can be a logic gate, memory source, or more. “It is a bottom-up or, as we call it, 'bottom-bottom' approach because we do not want to reach the material scale,” he explains.
Joachim’s team has focused on taking one individual molecule and building up computer components, with the ultimate goal of hosting a logic gate in a single molecule.
How many atoms to build a computer?
“The question we have asked ourselves is how many atoms does it take to build a computer?” Joachim says. “That is something we cannot answer at present, but we are getting a better idea about it.”
The team has managed to design a simple logic gate made up of 30 atoms that performs the same task as 14 transistors, while also exploring the architecture, technology and chemistry needed to achieve computing inside a single molecule and to interconnect molecules.
They are focusing on two architectures: one that mimics the classical design of a logic gate but in atomic form, including nodes, loops, meshes etc., and another, more complex, process that relies on changes to the molecule’s conformation to carry out the logic gate inputs and quantum mechanics to perform the computation.
The logic gates are interconnected using scanning-tunnelling microscopes and atomic-force microscopes – devices that can measure and move individual atoms with resolutions down to 1/100 of a nanometre (that is one hundred millionth of a millimetre!). As a side project, partly for fun but partly to stimulate new lines of research, Joachim and his team have used the technique to build tiny nano-machines, such as wheels, gears, motors and nano-vehicles each consisting of a single molecule.
“Put logic gates on it and it could decide where to go,” Joachim notes, pointing to what would be one of the world’s first implementations of atomic-scale robotics.
The importance of the Pico-Inside team’s work has been widely recognised in the scientific community, though Joachim cautions that it is still very much fundamental research. It will be some time before commercial applications emerge from it. However, emerge they all but certainly will.
“Microelectronics needs us if logic gates – and as a consequence microprocessors – are to continue to get smaller,” Joachim says.
The Pico-Inside researchers, who received funding under the ICT strand of the EU’s Sixth Framework Programme, are currently drafting a roadmap to ensure computing power continues to increase in the future.
Adapted from materials provided by ICT Results.

Quantum Computing Closer To Reality As Mathematicians Chase Key Breakthrough


ScienceDaily (Dec. 23, 2008) — The ability to exploit the extraordinary properties of quantum mechanics in novel applications, such as a new generation of super-fast computers, has come closer following recent progress with some of the remaining underlying mathematical problems. In particular, the operator theory used to describe interactions between particles at atomic scales or smaller where quantum mechanical properties are significant needs to be enhanced to deal with systems where digital information is processed or transmitted.

In essence, the theory involves mathematical analysis based on Hilbert spaces, which extend conventional three-dimensional Euclidean geometry to the additional dimensions required to describe quantum systems.
These challenges in mathematical analysis and prospects for imminent progress were discussed at a recent conference on operator theory and analysis organised by the European Science Foundation (ESF) in collaboration with the European Mathematical Society and the Mathematical Research and Conference Center in Będlewo, Poland. The conference brought together some of the world's leading mathematical physicists and quantum mechanics specialists to tackle the key fields relating to spectral theory, according to the conference's co-chair Pavel Kurasov from the Lund Institute of Technology in Sweden. Among the participants were Uzy Smilansky, one of the leading authorities on quantum chaos, from the Weizmann Institute of Science in Israel, and Vladimir Peller, a specialist in pure mathematical analysis at Michigan State University in the US.
As Kurasov pointed out, a big challenge lies in extending current operator theory to describe and analyse quantum transport in wires, as will be needed for a new generation of quantum computers. Such computers will allow some calculations to be executed much more quickly in parallel by exploiting quantum coherence, whereby a processing element can represent digital bits in multiple states at once. There is also the prospect of exploiting another quantum mechanical property, quantum entanglement, for quantum cryptography where encryption key information can be transmitted with the ability to detect any attempt at tampering or eavesdropping, facilitating totally secure communication. In fact quantum cryptography has already been demonstrated over real telecommunications links and will be one of the first commercial applications based exclusively on quantum mechanics.
The operator theory required for quantum information processing and transmission is already well developed for what are known as self-adjoint operators, which are used to describe the different quantum states of an ideal system, but cannot be used for systems like a communications network where dissipation occurs. "So far only self-adjoint models have been considered, but in order to describe systems with dissipation even non-self-adjoint operators should be used," said Kurasov. The aim set out at the ESF conference was to extend the theory to non-self-adjoint operators, which can be used to analyse real systems. "These operators may be used to describe quantum transport in wires and waveguides and therefore will be used in the design of the new generation of computers," said Kurasov. "Physicists are doing experiments with such structures, but the theory is not developed yet. An important question here is fitting of the parameters so that models will describe effects that may be observed in experiments." This question was discussed in an inspiring lecture by Boris Pavlov of the University of Auckland, New Zealand, a world-leading specialist in mathematical analysis who became interested in physical applications.
Intriguingly, Kurasov hinted that a breakthrough was likely before the next ESF conference on the subject in two years' time, on the problem of reconstructing the so-called quantum graphs used to represent states and interactions of quantum systems from actual observations. This will play a vital role in constructing the intermediate components of a quantum computer needed to monitor its own state and provide output.
Kurasov noted that this ESF conference was one in a series on the operator analysis field organized every second year, with proceedings published regularly in a book series Operator Theory: Advances and Applications.
The ESF conference Operator Theory, Analysis and Mathematical Physics was held at the Mathematical Research and Conference Center in Będlewo, Poland, in June 2008.
Adapted from materials provided by European Science Foundation.

Saturday, 14 March 2009

Tracking Tigers In 3-D

ScienceDaily (Mar. 12, 2009) — New software developed with help from the Wildlife Conservation Society will allow tiger researchers to rapidly identify individual animals by creating a three-dimensional model using photos taken by remote cameras. The software, described in an issue of the journal Biology Letters, may also help identify the origin of tigers from confiscated skins.
The new software, developed by Conservation Research Ltd., creates a 3D model from scanned photos using algorithms similar to fingerprint-matching software used by criminologists.
The study's authors include Lex Hiby of Conservation Research Ltd., Phil Lovell of the Gatty Marine Laboratory's Sea Mammal Research Unit, and Narendra Patil, N. Samba Kumar, Arjun N. Gopalaswamy and K. Ullas Karanth all of the Wildlife Conservation Society's India Program.
Researchers currently calculate tiger populations by painstakingly reviewing hundreds of photos of animals caught by camera "traps" and then matching their individual stripe patterns, which are unique to each animal. Using a formula developed by renowned tiger expert Ullas Karanth of WCS, researchers accurately estimate local populations by how many times individual tigers are "recaptured" by the camera trap technique.
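The capture-recapture models Karanth actually uses are statistically more sophisticated, but the classic Lincoln-Petersen estimator conveys the basic logic of estimating abundance from how often known individuals turn up again. The camera-trap counts below are invented purely for illustration.

```python
# Lincoln-Petersen estimator (Chapman's bias-corrected form), shown only to
# illustrate the recapture idea; not the specific model used by Karanth/WCS.
#   N ~ (n1 + 1)(n2 + 1) / (m + 1) - 1
# n1 = tigers photographed in the first survey, n2 = in the second survey,
# m  = tigers identified (by stripe pattern) in both surveys.
def lincoln_petersen(n1, n2, m):
    return (n1 + 1) * (n2 + 1) / (m + 1) - 1

# Hypothetical camera-trap sessions: 18 and 15 individuals, 9 seen in both.
print(round(lincoln_petersen(18, 15, 9)))   # about 29 tigers in the study area
```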
It is expected that the new software will allow researchers to rapidly identify animals, which in turn could speed up tiger conservation efforts.
"This new software will make it much easier for conservationists to identify individual tigers and estimate populations," said Ullas Karanth, Senior Conservation Scientist at the Wildlife Conservation Society and one of the study's co-authors. "The fundamentals of tiger conservation are knowing how many tigers live in a study area before you can start to measure success."
The study's authors found that the software, which can be downloaded for free at: http://www.conservationresearch.co.uk, was up to 95 percent accurate in matching tigers from scanned photos. Researchers were also able to use the software to identify the origin of confiscated tiger skins based solely on photos. Development of the software was funded through a Panthera project in collaboration with WCS.
Facilities for obtaining the images used for the construction of the three-dimensional surface model were provided by Thrigby Hall Zoo, Norfolk, England. The Centre for Wildlife Studies, Bangalore, and the Wildlife Conservation Society's India Program provided images, local resources and staff time for this study, which was supported in part by a grant from the Liz Claiborne / Art Ortenberg Foundation.
Adapted from materials provided by Wildlife Conservation Society, via EurekAlert!, a service of AAAS.

Random Network Connectivity Can Be Delayed, But With Explosive Results, New Study Finds

ScienceDaily (Mar. 12, 2009) — In the life of many successful networks, the connections between elements increase over time. As connections are added, there comes a critical moment when the network's overall connectivity rises rapidly with each new link.
Now a trio of mathematicians studying networks in which the formation of connections is governed by random processes has provided new evidence that super-connectivity can be appreciably delayed. But the delay comes at a cost: when it finally happens, the transition is virtually instantaneous, like a film of water abruptly crystallizing into ice.
The team's findings — described in a paper with an accompanying commentary in the March 13 issue of the journal Science — could be useful in a number of fields: from efforts by epidemiologists to control the spread of disease, to communications experts developing new products.
"We have found that by making a small change in the rules governing the formation of a network, we can greatly manipulate the onset of large-scale connectivity," said Raissa D'Souza, an associate professor of mechanical and aeronautical engineering at UC Davis.
In the classic model of random network formation, known as the Erdős-Rényi model, connections are added from among a large collection of points one at a time by randomly selecting a pair of points to connect. Two points are considered to be in the same group if it is possible to go from one to another along a continuous line of connections. A group remains very small until the number of connections reaches at least half the number of points. After that, the growth of the largest group follows a steep upward curve.
D'Souza, along with co-investigators Dimitris Achlioptas at UC Santa Cruz and Joel Spencer at New York University, wanted to explore how a network would change if there were an element of choice injected into its formation. In their mathematical model, they considered two random connections in each step, and selected only one. To make their choice, they multiplied the number of points in the group linked to one end of a connection by the number of points linked to its other end. And in each case, they chose the connection that yielded the lower product.
As they expected, this process delayed the onset of super-connectivity. But the team's analysis provided strong evidence for a new phenomenon: when a system is suppressed like this, it builds up a kind of pressure. "This algorithm yields a very violent transition," Achlioptas said, "reaching a critical moment at which the probability that two points are connected jumps from essentially zero to more than 50 percent instantaneously."
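The selection rule is simple enough to simulate directly. The sketch below is an illustrative simulation of such a product-rule process, not the authors' own code, and the network size and random seed are arbitrary; it uses a union-find structure to track component sizes and prints how the largest group grows as connections are added.

```python
# Product-rule process: at each step, draw two candidate connections and keep
# the one whose endpoint groups have the smaller product of sizes.
import random

def find(parent, x):
    while parent[x] != x:
        parent[x] = parent[parent[x]]      # path halving
        x = parent[x]
    return x

def simulate(n=100_000, seed=1):
    parent = list(range(n))
    size = [1] * n
    largest = 1
    random.seed(seed)

    def product(edge):
        a, b = find(parent, edge[0]), find(parent, edge[1])
        return size[a] * size[b]

    for step in range(1, n + 1):
        e1 = (random.randrange(n), random.randrange(n))
        e2 = (random.randrange(n), random.randrange(n))
        edge = e1 if product(e1) <= product(e2) else e2   # keep the lower product
        a, b = find(parent, edge[0]), find(parent, edge[1])
        if a != b:
            if size[a] < size[b]:
                a, b = b, a
            parent[b] = a                                  # merge the two groups
            size[a] += size[b]
            largest = max(largest, size[a])
        if step % (n // 10) == 0:
            print(f"connections={step:6d}  largest group = {largest / n:.3f} of all points")

# The largest group stays tiny for most of the run, then jumps abruptly --
# the delayed, near-instantaneous transition described in the article.
simulate()
```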
Their calculations for this model have provided important insights that could be broadly applicable to understanding and influencing the behavior of various kinds of networks, D'Souza said, adding that the work should also spark a quest for a mathematical proof to back their findings, an endeavor that may require new mathematics.
"Consider this," she said. "Often we are presented with two alternatives, and must choose one. We have no control over which alternatives are presented, but we certainly can control what we choose."
Adapted from materials provided by University of California - Davis, via EurekAlert!, a service of AAAS.