How does computer memory work?

All of the components in your computer, such as the CPU, the hard drive and the operating system, work together as a team, and memory is one of the most essential parts of this team.

From the moment you turn your computer on until the time you shut it down, your CPU is constantly using memory. Let's take a look at a typical scenario: when you open an application, it is loaded into RAM. This simply means that it has been put in the computer's temporary storage area so that the CPU can access that information more easily.

When an application is closed, it and any accompanying files are usually purged from RAM to make room for new data. If the changed files are not saved to a permanent storage device before being purged, they are lost.
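
As a small illustration of that point, here is a minimal sketch in Python, using a made-up notes.txt file: it loads a file's contents into RAM, edits the in-memory copy, and shows why the edit survives only if it is written back to permanent storage before the program exits.

```python
# Minimal sketch: a document is read from disk into RAM (a Python string),
# edited in memory, and lost unless it is explicitly written back to disk.

from pathlib import Path

doc_path = Path("notes.txt")          # hypothetical file used for illustration
doc_path.write_text("draft v1\n")     # pretend this file already existed on disk

# "Opening" the document loads its contents into RAM.
in_memory_copy = doc_path.read_text()

# Edits happen to the in-memory copy only; the file on disk is unchanged.
in_memory_copy += "an unsaved edit\n"

# If the program exited here, the edit would be gone: it lived only in RAM.
# Saving copies the in-memory data back to permanent storage.
doc_path.write_text(in_memory_copy)
print(doc_path.read_text())
```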

One common question about desktop computers is, "Why does a computer need so many memory systems?"

Even when you use flash, you still have to go through the same process of reading and writing that you do with a disk.

Flash accesses information faster than disk, but it still uses the same slow process to get the data to the processor. Moreover, because of inherent limitations in flash's physical design, it endures only a finite number of write cycles before it needs to be replaced. Modern RAM, on the other hand, has no such wear limit and takes up less space than flash.

Flash may be five to ten times faster than a standard disk drive, but RAM is up to a million times faster than the disk. Combined with the other benefits, there's no comparison.
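
As a rough, order-of-magnitude illustration of that gap, the sketch below times repeated lookups of a value already held in memory against repeated reads of the same bytes from a small file that the script itself creates. On a modern operating system the file will be served from the page cache after the first read, so the measured ratio will be far smaller than "a million times"; even so, the in-memory path typically wins by a wide margin.

```python
# Rough illustration (not a rigorous benchmark) of the RAM-vs-disk gap:
# look up a value already in memory vs. re-read the same value from a file.

import os
import time

# Prepare a small file on disk and the same data in a Python dict (RAM).
with open("value.bin", "wb") as f:
    f.write(b"x" * 4096)
in_ram = {"value": b"x" * 4096}

N = 10_000

start = time.perf_counter()
for _ in range(N):
    _ = in_ram["value"]                 # served straight from RAM
ram_seconds = time.perf_counter() - start

start = time.perf_counter()
for _ in range(N):
    with open("value.bin", "rb") as f:  # goes through the file system each time
        _ = f.read()
disk_seconds = time.perf_counter() - start

print(f"RAM: {ram_seconds:.4f}s  disk: {disk_seconds:.4f}s "
      f"(~{disk_seconds / ram_seconds:.0f}x slower)")
os.remove("value.bin")
```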

RAM provides the speed of in-memory computing, but the scalability of the technology comes from parallelization. Parallelization came about to solve a different problem: the inadequacy of 32-bit processors.

Most servers have since switched to 64-bit processors, which can handle far more data. But at the time, 32-bit processors were the norm, and they were very limited: they couldn't address more than four gigabytes of RAM at a time. Even if you put more RAM in the computer, a 32-bit processor couldn't see it. But the demand for more RAM was growing anyway.
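
The four-gigabyte ceiling falls straight out of the address width: a 32-bit address can name only 2**32 distinct bytes. A quick back-of-the-envelope check:

```python
# Back-of-the-envelope check of the 32-bit ceiling: a 32-bit address can name
# 2**32 distinct bytes, which is exactly 4 GiB; a 64-bit address space is
# larger by a factor of 2**32.

GiB = 1024 ** 3

addressable_32bit = 2 ** 32
addressable_64bit = 2 ** 64

print(f"32-bit: {addressable_32bit / GiB:,.0f} GiB")   # 4 GiB
print(f"64-bit: {addressable_64bit / GiB:,.0f} GiB")   # about 17 billion GiB
```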

The solution was to spread the data across the RAM of many different computers. Once the data was broken up like this, each processor could address its share. To the application, the cluster of computers looked like a single computer with a huge amount of RAM. You split up the data and the tasks, you use the collective RAM for storage, and you use all the computers for processing. That was how you handled a heavy load in the 32-bit world, and it was called parallelization, or massively parallel processing (MPP).
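
To make the idea concrete, here is a toy sketch of that kind of data partitioning. The record format, the number of nodes, and the placement rule are illustrative assumptions, not any particular product's implementation; the point is simply that no single machine has to hold the whole dataset, while the cluster as a whole does.

```python
# Toy sketch of MPP-style partitioning: split a dataset into shards so that
# each machine only keeps its own shard in RAM, while the cluster's
# collective RAM holds everything. All names and sizes are illustrative.

num_nodes = 4
records = [f"user-{i}" for i in range(1_000_000)]   # stand-in for a large dataset

# Assign every record to exactly one node using a simple placement rule.
shards = {node: [] for node in range(num_nodes)}
for record in records:
    shards[hash(record) % num_nodes].append(record)

for node, shard in shards.items():
    print(f"node {node} holds {len(shard):,} records")
print(f"cluster total: {sum(len(s) for s in shards.values()):,} records")
```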

When 64-bit processors were released, they could address a practically unlimited amount of RAM, and parallelization was no longer necessary for its original purpose.

But in-memory computing found a different way to take advantage of it: scalability. Even though 64-bit processors could handle far more data, it was still impossible for a single computer to support a billion users. But when you distributed the processing load across many computers, that kind of support became possible. Better still, if the number of users increased, all you had to do was add a few more computers to grow with them.

Picture a row of six computers. You could have thousands of computers, but we'll use six for this example. These computers are connected through a network, so we call them a cluster. Now imagine you have an application that will draw a lot of traffic, too much for all of its data to be stored on one computer. With parallelization, you take your application and break its data into pieces.

Then you put one piece of it in computer 1, another piece in computer 2, and so on until the data is distributed optimally across the cluster. Your single application runs on the whole cluster of computers. When the cluster gets a request for data, it knows which machine holds that data and processes the information in that machine's RAM.
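
Here is a minimal sketch of that routing step for the six-computer example, assuming the same kind of modulo placement rule as above; the key names and the hash function are made up for illustration.

```python
# Sketch of request routing in the six-computer cluster: the placement rule
# that decided where each piece of data lives also tells the cluster which
# machine's RAM to read from when a request arrives. Purely illustrative.

NUM_COMPUTERS = 6

def owner_of(key: str) -> int:
    """Return which computer (1-6) holds the data for this key."""
    # A deterministic hash keeps placement stable across processes and restarts;
    # Python's built-in hash() is randomized per process, so avoid it here.
    digest = sum(key.encode()) * 2654435761 % 2**32
    return digest % NUM_COMPUTERS + 1

for key in ("alice", "bob", "carol"):
    print(f"request for {key!r} is served from RAM on computer {owner_of(key)}")
```

In a real data grid the placement rule also has to cope with machines joining or leaving the cluster, which is why production systems usually prefer consistent hashing or explicit partition tables over a plain modulo.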

RAM allows your computer to perform many of its everyday tasks, such as loading applications, browsing the internet, editing a spreadsheet, or playing the latest game. Memory also allows you to switch quickly among these tasks, remembering where you are in one task when you switch to another. As a rule, the more memory you have, the better. Memory is used to load and run applications, such as your spreadsheet program, to respond to commands, such as any edits you make in the spreadsheet, and to toggle between multiple programs, such as when you leave the spreadsheet to check email.

Memory is almost always being actively used by your computer. If your system is slow or unresponsive, you may need a memory upgrade. In a way, memory is like your desk. It allows you to work on a variety of projects, and the larger your desk, the more papers, folders, and tasks you can have out at one time.

You can quickly and easily access the information without going to a filing cabinet (your storage drive). Your storage drive (hard drive or solid-state drive) is the filing cabinet that works with your desk to track your projects. RAM is used to store information that needs to be used quickly.

This means that opening many programs, running various processes, or accessing multiple files simultaneously is likely to use a lot of RAM. Particularly complex programs, such as games or design software, will use the most RAM. Whether you are a gamer, a designer, or just looking to speed up your personal computer, upgrading RAM is a simple and easy way to boost your system's performance.
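
If you want to see how much of your own machine's RAM is in use, one option (an assumption here, not something this article prescribes) is the third-party psutil package, installed with pip install psutil:

```python
# Show total, available, and percentage-used RAM on the local machine,
# assuming the third-party psutil package is installed.

import psutil

mem = psutil.virtual_memory()
GiB = 1024 ** 3
print(f"total:     {mem.total / GiB:.1f} GiB")
print(f"available: {mem.available / GiB:.1f} GiB")
print(f"used:      {mem.percent:.0f}%")
```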
