Research

Bridging the Gap Between Computers and the Human Mind

Memristor is the name given to the memory resistor, a device scientists are trying to develop in order to make computers operate more like the human brain does. With the help of a recent technological advancement, it is hoped that computers will one day soon be able to perform logical operations and retain memories much like people do. What role will memristors play?

Operating more like neurons than like flash memory would mean human-like processing speed and no more wait times while booting up. Also, where RAM is flawed in that it loses whatever it held when power is lost, memristors will always remember their state at the time the power supply was cut off. These special resistors are also unaffected by radiation and require far less power than standard computing components. What is more, they are extremely unlikely to crash. So what is the catch?

Memristors are two-terminal electronic components. That means the only way to tune them is to change the voltage applied externally across those two terminals. A research team working on the problem, however, may have fixed this issue by developing three-terminal memristors. Adding a third electrode provides a far more manageable way to control the resistance. How did they make this adjustment?

The key technological advancement was a unique semiconductor. A nanomaterial, molybdenum disulfide, is used as a sort of interface for the memristors by altering the current flow. This published discovery is thought to be a cornerstone of the future of computing.

Will computers one day soon operate much the way our brains do? Is this the direction of our technological advancement? Time will tell whether memristors will shape the future much as Alan Turing’s logic machines did back during World War II.


Quantum Computing Power: One Step Closer

What is the future of super high-powered, decision-making computers? It’s quantum computing, of course. But developers have been perplexed for years about how to get some of the details to work. One problem has been accurately processing the sheer amount of data involved in quantum computing. Another has been storing quantum information for longer than about half a minute. Two teams of researchers, from Australia and Wales, have overcome these two issues, and the world is one step closer to quantum computing.

Data in a regular computer is stored as bits, each either a 1 or a 0. Quantum units of data can take the same 1 or 0 values, but they are given the name qubits. Again, one major issue with quantum computing has been the fragility of qubits. Data that decays in under 30 seconds leads to miscalculations that may start small but ultimately make quantum algorithms unusable. That’s why overcoming the fragile state of the qubits and increasing the accuracy of calculations have been a main focus of researchers.
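The compounding effect of small errors can be seen with a little arithmetic: even a tiny chance of failure per operation eats away at the odds that a long computation finishes cleanly. Here is a minimal sketch, using an illustrative per-operation error rate of 0.01 percent:

```python
# Why tiny per-operation error rates still matter: errors compound
# multiplicatively over the many operations an algorithm performs.
# (Illustrative arithmetic only; real quantum error models are more complex.)
error_rate = 0.0001  # 0.01% per operation, i.e. 1 in 10,000

for n_ops in (100, 1_000, 10_000, 100_000):
    p_all_ok = (1 - error_rate) ** n_ops  # chance every operation succeeds
    print(f"{n_ops:>7} operations: {p_all_ok:.2%} chance of no error")
```

Even at 1 error in 10,000, a computation of 10,000 operations finishes error-free only about a third of the time, which is why error rates this low are still an active research target rather than the finish line.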

The two teams used silicon to develop qubits that overcame the limitations of previous qubits, which were built around phosphorus. The silicon-28 isotope has no net nuclear spin, so it adds no magnetic noise of its own, allowing it to be used in an electronic device (like other items made from silicon). A major advantage of the silicon-28 qubits is that they dropped the error rate to a remarkable 0.01 percent. This means that only 1 out of every 10,000 operations failed to produce an accurate result.

Moreover, the data remained intact for over half a minute. In the world of quantum computing, where things happen in microseconds, that may as well be forever. Now the teams are working on increasing quantum computing power even further by developing super-precise qubits in entangled pairs.


Eco Friendly Windows: Transparent Solar Collectors

First of all, let’s make it clear that we are talking about the actual windows in your home or office, not the Windows operating system. Second, by green, we mean eco friendly windows. How are researchers working on windows that can help reduce energy consumption and costs? What is this potentially world-shaping new idea?

Researchers have been spending a lot of time developing better solar panels, but some ingenious developers in Michigan got together and came up with the idea of transparent solar collectors. This would allow conventional windows to be replaced with green energy collectors.

Of course, this isn’t the first time someone has come up with the idea; the Michigan researchers are just the first to make it feasible. Previous versions of transparent solar collectors were scrapped because they simply didn’t collect and output enough energy to be worth the cost. The other issue was actually making them transparent. Some were tinted so dark that they were hardly windows anymore. Others required colored tinting, which severely limited their market, since most homeowners and shopkeepers will opt for clear glass over colored. After all, when light comes through colored glass, it changes the entire appearance of the room, tinting everything with the hue of the glass.

So, what is the ‘secret ingredient’ that solves these problems? The transparent collectors work by absorbing light that is already invisible to the human eye, such as ultraviolet and near-infrared light, so no visible colors are lost. This is done using organic molecules so tiny that the glass still appears transparent despite their presence.

While you aren’t going to be seeing these clear solar energy collectors in homes or offices quite yet, the technology works and is being tweaked to prepare it for eventual mass production. Imagine a home where all of the ‘windows’ actually collected energy from the sun to help power the home. Imagine offices going green by replacing current windows with these eco friendly windows. This is really a huge advancement in the battle against global warming.


Gigabit Wireless – Is This the Wireless of the Future?

Wireless networks constantly need the ability to handle more and more data. Two papers about gigabit wireless, both addressing this problem, were recently presented at a conference on wireless communications. One of the papers suggested that the millimeter-wave band, or MMW, could be the answer, noting that it could, at least in theory, increase the wireless capabilities of smartphones and tablets. The researchers were interested in technologies that increase the data capacity of wireless networks. They found that polarimetric filtering allows a greater density of data links. The research suggested that each MMW link could reach more than 6 Gbps and that there could be more than one link at a time in a room. This is much better performance than current technology allows.
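The multi-gigabit figure is plausible from first principles: Shannon’s capacity formula ties a channel’s maximum data rate to its bandwidth, and millimeter-wave channels can be far wider than today’s cellular channels. Here is a minimal sketch with illustrative bandwidth and signal-to-noise values, which are assumptions for the example rather than figures from the papers:

```python
# Shannon capacity: C = B * log2(1 + SNR), the upper bound on the
# error-free data rate of a noisy channel of bandwidth B.
import math

def shannon_capacity_bps(bandwidth_hz, snr_db):
    """Channel capacity in bits per second for a given bandwidth and SNR."""
    snr_linear = 10 ** (snr_db / 10)  # convert decibels to a ratio
    return bandwidth_hz * math.log2(1 + snr_linear)

# Compare a 20 MHz channel (typical of 4G) with a hypothetical 2 GHz
# millimeter-wave channel, both at a 10 dB signal-to-noise ratio.
for name, bw in (("20 MHz (4G-like)", 20e6), ("2 GHz (MMW)", 2e9)):
    capacity = shannon_capacity_bps(bw, 10)
    print(f"{name}: {capacity / 1e9:.2f} Gbps")
```

With these assumed numbers the wide MMW channel lands in the same multi-gigabit range the paper reports, simply because capacity grows in proportion to bandwidth.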

One of the students working on the project created a video to illustrate what this may be able to accomplish. Every year, the demand for data doubles, and it does not seem likely that this demand will decline any time soon. This is forcing service providers to make their networks denser. Moreover, 3G and 4G networks are reaching their limits, another factor that is causing developers to consider MMW for 5G networks.

The other paper discussed beamforming, or spatial filtering, as a possible solution for connections between 4G and 5G stations and networks. It also spoke positively of connections that go directly to the user, which would involve focusing the waveforms directly onto the tablets and phones. Until now, the connection to the core network has limited the data the cell network can handle. The research proposed an algorithm to increase the data rate while, at the same time, reducing possible interference. The researchers had some success in doing this, and the future of this technology is bright.

These papers reflect the ongoing efforts to give gigabit wireless the ability to handle the increasing demand for data.


25 Years of the Internet: What Would We Do Without You?

Nowadays, with the Internet such a central and necessary part of our lives, it is hard to remember a time before it existed. We owe much to Tim Berners-Lee, who in 1989, while working in Switzerland, came up with what was originally just an idea for a way for scientists to share data. Of course, with over 25 years of the Internet now a historical fact, it has become a mature system and a way for just about everyone to share just about everything.

Considering our inability to function without it now, it is surprising that at the time the development of the Internet did not seem like a major advancement. Computers had already been around for a while and were becoming more common. In fact, since the 1940s, scientists had been trying to figure out how to get computers to “talk” to each other. Then, starting in 1969, researchers began to develop the Internet itself. Still, in the late 1980s most people did not have a computer. Unless you had a job that made it useful, as in accounting, it probably would have just sat in a corner, especially since there was no easy way for one computer to communicate with another.

That all changed in 1989, when Tim Berners-Lee proposed a way for information to be managed and shared between colleagues at the laboratory where he worked. Tim’s boss did not entirely understand the proposal, but he allowed Tim to continue working on the idea anyway.

The basic idea was actually not very complex. It had to do with connecting information to the Internet using three different technologies. The first was what we know as a URI or URL, a specific identifier for a resource on the Internet. The second was HTML, the HyperText Markup Language, which describes the pages themselves. The third was the HyperText Transfer Protocol, or HTTP, which transfers hypertext between computers. There were similar systems before Berners-Lee, but his process really allowed for servers to be built easily. He and his team created the first web page editor, the first server, and the very first web page. Two years later, they released it to the rest of the world for free.
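The three technologies can be seen working together in a few lines of code: a tiny server hands out an HTML page, and a client fetches it over HTTP using a URL. This is a minimal sketch built on Python’s standard library, purely to illustrate the division of labor, and has no connection to Berners-Lee’s original software:

```python
# URL identifies the resource, HTML describes the page, HTTP moves it.
import threading
from http.server import HTTPServer, BaseHTTPRequestHandler
from urllib.request import urlopen

PAGE = b"<html><body><h1>Hello, Web</h1></body></html>"  # the HTML document

class Handler(BaseHTTPRequestHandler):
    def do_GET(self):                       # answer an HTTP GET request
        self.send_response(200)
        self.send_header("Content-Type", "text/html")
        self.end_headers()
        self.wfile.write(PAGE)
    def log_message(self, *args):           # keep the demo output quiet
        pass

# Start a throwaway server on any free local port.
server = HTTPServer(("127.0.0.1", 0), Handler)
threading.Thread(target=server.serve_forever, daemon=True).start()

url = f"http://127.0.0.1:{server.server_port}/index.html"  # the URL
html = urlopen(url).read()                                  # HTTP in action
print(html.decode())
server.shutdown()
```

The part that made Berners-Lee’s design spread is visible even here: any machine can play the server role with a handful of lines, and any client that speaks HTTP can read what it publishes.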

At first, though, the web was not really anything exciting. There actually were some other ways of gaining access to the Internet, but there wasn’t much to do because most networks were closed. The web, by contrast, was not owned by anyone and could be tailored to fit specific purposes. What contributed greatly to the explosion of users was the Mosaic browser. The rest, as they say, is history. Now, with 25 years of the Internet behind us, it has expanded into the system we know today, one that affects almost every facet of everyday life.

How Science Is Working to Keep Cell Towers from Overloading

It’s a well-known problem: When disaster strikes, it’s tough to get in touch with loved ones and make sure they’re okay. And ironically enough, the disaster itself doesn’t have to damage a network’s cell towers to cause this issue; cell towers simply can’t handle the traffic of everyone in the area trying to connect all at once. Text messages may not go through, and a call that actually does may get dropped after only a few seconds. We rely on connectivity 24/7 in this modern age, pushing experts to work more rapidly on solutions that stop, and even prevent, cell towers from overloading in crucial situations.

According to the progress made so far, the secret will eventually lie in utilizing radio and television signals, along with phones equipped with “smart antennas.” Some phones possess multiple types of antennas, mainly because various devices utilize different means of communication, including cellular, Wi-Fi, Bluetooth, near field communication and other technologies. Unfortunately, these signals can sometimes undermine one another. Perhaps you’ve noticed your smartphone dropping calls while someone else’s flip phone works just fine. Of course, these crossed signals can work the opposite way too, with you sometimes finding yourself the only person with reception.

The secret is to turn the second situation to the advantage of the cell network. Rather than having calls dropped because of too many phones trying to connect at once, the idea is to use different types of connections to boost capacity. The cellular industry had to be fine-tuned to ensure that cell signals didn’t interfere with TV and radio channels. But what if, in an emergency situation, there were a failsafe that allowed mobile users to use those other airwaves to make that all-important phone call? This may be the future of mobile usage, and even data connectivity, as we know it.

Long Term Digital Storage

Need to store your data for more than 10 years? Right now that means countless backups and transferring data to new locations, as digital storage has, more or less, a one-decade shelf life. Researchers, however, have discovered a solution to the long-term data storage problem, and data hoarders everywhere have rejoiced.

A 10-year shelf life isn’t the only problem plaguing data storage. A mechanical failure (like a head crash) can result in the loss of data, and magnetic fields also pose a threat. These factors have led to more research into the right combination of materials for a more permanent storage solution. Will the new storage method retain data for a hundred years? A thousand? Actually, the number is more in the millions.

Due to its exceptional heat resistance, tungsten was the metal chosen for the studies; heat is the biggest threat to all digital technology, after all. Silicon nitride was also used to increase heat resistance and protect against physical damage. Surprisingly enough, scientists believe that not only can the data contained on such storage devices outlast the human race itself, but that it could actually still be available and intact for whatever dominant intelligent species comes next on earth. Regardless of how ridiculous that sounds, it’s still an impressive feat of technology.

How does data get stored on tungsten? It’s done with tiny QR codes like the ones you see on subway and magazine ads, only on a microscopic scale. A test was conducted to determine how long the data could be stored, with the storage medium being exposed to heat for certain lengths of time. The exposure simulated specific time periods: for example, to simulate a million years of storage, the test exposed the data to a temperature of 400 degrees Fahrenheit for a full hour. The study showed that the data survived undamaged enough to still be read.
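The heat test relies on a standard accelerated-aging idea: degradation speeds up roughly exponentially with temperature (the Arrhenius model), so a short, hot bake can stand in for ages of room-temperature storage. Here is a rough sketch of that arithmetic; the activation energy is a hypothetical value chosen purely so the numbers land near the article’s figures, not a number from the study:

```python
# Arrhenius accelerated aging: degradation rate scales as exp(-Ea / (kB*T)),
# so the acceleration factor between two temperatures is
# exp((Ea/kB) * (1/T_use - 1/T_stress)).
import math

K_B = 8.617e-5     # Boltzmann constant, eV/K
E_A = 1.56         # hypothetical activation energy, eV (illustration only)
T_USE = 298.0      # room temperature, K
T_STRESS = 477.6   # 400 degrees Fahrenheit expressed in kelvin

# How much faster degradation runs at the stress temperature.
accel = math.exp((E_A / K_B) * (1 / T_USE - 1 / T_STRESS))

hours_per_year = 24 * 365.25
years_simulated = 1.0 * accel / hours_per_year  # one hour in the oven
print(f"One hour at {T_STRESS:.0f} K simulates ~{years_simulated:.2e} years")
```

With this assumed activation energy, a single hour at 400 °F corresponds to storage on the order of a million years at room temperature, which is the shape of the claim the researchers are making.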

While the test isn’t foolproof, it does show that data can survive millions of years of normal heat conditions, making daily wear and tear a non-issue and keeping our precious files intact.
