The Evolution of the Disk Drive
Apr 10th, 2015 by aperio

Technology is constantly evolving, and so is the disk drive. The small device you rely on so heavily every day as part of your computer or server has evolved from reels of tape into a compact unit that can store mountains of information.

When computers were first introduced, they could not hold much data on their own. Before computers, everything was done manually; once computers arrived, storage lived on external devices, which companies carefully locked away in a safe and took out each day for use.

One of the first storage devices, which many people don’t even know existed, was magnetic tape. Magnetic tape let computer operators store high volumes of data: a single reel held a long ribbon of tape that could accommodate roughly ten thousand punched cards’ worth of information. In those days that was a lot; today it probably wouldn’t cover a full day’s work.

Then came the floppy disk. Introduced in the 1970s, it could hold in the region of four hundred and eighty kilobytes in a smaller and more compact design than magnetic tape. This enabled computer operators to share information by carrying disks from computer to computer.

In the 1980s a smaller floppy disk was introduced. These disks were housed in hard plastic and measured only 3.5 inches. They remained standard in computers for roughly three decades; manufacturers only stopped fitting the drives a few years ago, and some users still rely on the disks today.

The next addition was the CD-ROM, or compact disc drive, an exciting introduction that improved both storage capacity and speed. Operators could save a document to a CD in seconds rather than waiting minutes for magnetic tape or a floppy disk. CDs also held more data, and they were thin and much easier to carry around. They were used extensively in gaming, letting developers sell their games to computer users with simple installation instructions.

Next came the DVD drive, which many people still use today. It sped up the saving process and could hold large volumes of information. Many DVDs are also rewritable, saving companies money because they do not have to replace the discs every time they run a backup or save a file.

SD memory cards were introduced in 2000. These small, secure cards started out in thirty-two and sixty-four megabyte sizes, and the format now scales all the way up to two terabytes. They are still used extensively today in smartphones, cameras and tablets.

Most companies rely on external disk drives these days, reducing how much they have to store on their computers. External drives also let people save data, take the drive with them and plug it into another computer to access the information.

Internal disk drives are used extensively as well, especially in servers. Companies that rely on servers keep a number of high-capacity drives to store all of their important data and information. From there it is accessible to every computer in the office, reducing how much is saved on each individual machine. This also makes backups easier and quicker, since all the information is saved from one point.

Are you playing it safe when it comes to the cloud?
Feb 5th, 2015 by aperio

Yes, it’s all going to the cloud, which is better than “to the dogs.” And yes, you have to make sure your cloud environment is secure.

You need to confront some hard realities about cloud security, because the cyber landscape continues to be unforgiving. It doesn’t matter whether you’re protecting traditional computer systems, your mobile platform or the cloud itself. Simply put, organized cyber crime and cyber espionage continue to grow in sophistication, and any new hackable platform is red meat for them. Opening massive breaches that harvest critical data is their day-and-night job. News headlines make clear that the aggregate total of global cyber crime damage now rivals many nations’ annual gross domestic product (GDP).

First reality: Organizations spend considerable time and money securing their on-premises infrastructure. That’s good. The problem is maintaining that same high level of security when outsourcing to the cloud. Security delivery requires a cloud provider’s undivided attention. Yes, there are built-in security tools, but they will not give you the key to any strong security posture: 24/7/365 threat monitoring, analysis and response, in other words a managed security service with humans watching out for you. You must know what’s happening in the cloud in real time and be able to respond very quickly. You need people to manage this, even if you have automated capabilities as part of your cloud security. The “cloud” doesn’t do it on its own.

(Related: An interview with Brendan Hannigan, IBM GM Security Systems Division)

Second reality: Repeat after me: “My cloud will be breached.” Take a deep breath. Say it one more time.

Remember, just because you’ve been breached doesn’t mean an attacker knows where to go once inside your system. If you identify the attack quickly, you can keep the intruder from reaching your critical data.

So, review your incident response plan for cloud security. What, you don’t have one? Okay, review the plan you have for your premises infrastructure.

If you still have a blank look, gather your team and start putting a response plan together—fast. How you handle it is crucial, particularly the speed of your response. Sophisticated attacks often show no upfront “symptoms” but can quietly devastate your business over time. The longer it takes to resolve an attack, the more costly it becomes.

Prevention starts with an incident-response plan and mock exercises to test that plan. Get an experienced provider to try to hack your cloud and find out where your vulnerabilities are. Most important, make sure you have a team ready to move quickly and decisively if you suspect your cloud has been attacked.

Third reality: Last but maybe most important, get smart about “security intelligence.” Your cloud systems, along with your other IT platforms, generate billions of security events each day from firewalls, email, servers and the like. It’s simply not possible to sift through this data manually and find evidence of suspicious behavior. Beyond the cost involved, manual review is confined to figuring out “what happened” rather than “what will occur.”

When applied to security data, big-data analytics tools can be transformative—the tip of the spear in security intelligence and response. Analytics can provide automated, real-time intelligence and situational awareness about your infrastructure’s state of security to help disrupt the attack chain.

Say that two similar security incidents take place, one in Brazil, the other in Pittsburgh. They may be related. But without the intelligence needed to link them, an important pattern—one that could indicate a potential incident—may go unnoticed.
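
In code terms, the linking step is simple correlation on a shared indicator. The sketch below is a toy, hypothetical illustration (the event fields, values and sites are invented, not any particular product’s schema); a real security-intelligence platform does this continuously, at scale, across far richer data.

```python
# Toy illustration of linking incidents across sites by a shared indicator
# (here, a source IP). Field names and data are hypothetical.
from collections import defaultdict

events = [
    {"site": "Brazil",     "source_ip": "203.0.113.7",  "type": "failed_admin_login"},
    {"site": "Pittsburgh", "source_ip": "203.0.113.7",  "type": "failed_admin_login"},
    {"site": "Pittsburgh", "source_ip": "198.51.100.4", "type": "port_scan"},
]

# Group events by the indicator we want to correlate on.
by_indicator = defaultdict(list)
for event in events:
    by_indicator[event["source_ip"]].append(event)

# Any indicator seen at more than one site deserves a closer look.
for indicator, related in by_indicator.items():
    sites = {e["site"] for e in related}
    if len(sites) > 1:
        print(f"Possible linked incidents from {indicator}: {sorted(sites)}")
```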

You need this capability, and providers like IBM are stepping up to make it the ultimate reality.

Stay safe.

TIPS TO EASILY EMERGE INTO THE NEW YEAR
Dec 22nd, 2014 by aperio

We are all heading somewhere.

Whether it’s to visit family for the holidays, a trip to the grocery store or being on the path to enhance your life, one thing necessary to arrive at your destination is learning how to change lanes effectively.

I could speak to you about literally changing lanes – using your blinkers, mirrors and gracefully merging into the gap that awaits you. That will surely get you to your destination safely and on time.

Let’s move this conversation to metaphorically changing lanes in your life.

With the New Year right around the corner, reflecting on 2014 will inform you of where you want to go in 2015.

What lane do you want to be in 2015?

The fast lane, the slow lane or somewhere in the middle?

For those of you that have been following me for a minute, you know I choose the fast lane. However, I’ve had a few signals lately, partnered with internal nudges, to move to a different lane.

How the heck do you do that when you’re used to moving at 100 mph and you have got places to go?

First, ease your foot off the gas. As I ponder what’s most important these days, I reconnect with my personal values, the non-negotiables that have to be present for me to say YES to something. If the activity is not aligned with those values, I say no, or at least not now. One of my values is to have fun with everything I do, so if it is not fun, I either look at how to make it fun or delegate the task to someone that finds it fun.

As I slow down, I enjoy the beauty around me. This lets me pick and choose the things I want to fill my calendar with. The second step to effectively changing lanes is to surround yourself with people that are spending time in the lane you want to be in.

Jim Rohn, author, entrepreneur and motivational speaker, said that we are most like the five people we spend the most time with. Make a list of the people you are with the most and honestly ask yourself if they are supporting you in getting to your chosen destination. If not, start to put your attention on the type of people you desire and begin to attract them into your life.

Last but not least, have a clear picture of where you are going. I-285 is a highway that forms a big circle around Atlanta. If you’re not sure where you’re heading, it can feel like you’re going in circles, literally. Choose what you want to achieve this month and pick a lane, any lane. Committing to it will help you arrive there with more ease.

If you want to enjoy the holidays like you never have before, make sure your activities are aligned with what is most important to you, weave in the people you want to spend time with and have a clear picture of what you want to accomplish, even if accomplishment means more naps during the holiday break.

What lane will you choose to drive you forward in creating a phenomenal New Year?

Article Source: http://EzineArticles.com/8853896

Photo Source: Viktor Hanacek
Increase Computer Performance By Defragging
Dec 9th, 2014 by aperio

It is easy to decide to replace a computer when it no longer runs the way it did when it was new. With developments and upgrades constantly appearing on the market, some people simply do not bother maintaining their computers; they dismiss slow machines as “past their prime” and immediately look for a better model.

Although buying a new computer can instantly solve the problems of an aging machine, that option is not available to people on a budget. That does not mean they have to put up with long boot times, blue screens and sudden shutdowns, however. A simple process known as defragging can improve performance and postpone the decision to buy another unit.

Defragmenting is the process of reversing the fragmentation of files on a hard drive. Fragmentation occurs with prolonged use and poor maintenance: the PC ends up with pieces of files scattered across the free spaces on the drive, which slows program execution and file opening and invites other bugs and errors. Defragmenting counteracts these issues and restores efficiency to the computer in several ways (a short code sketch after the list illustrates the point):

1. Faster Boot Times – Slow startup occurs when the system takes too long to find certain files that are needed when the computer starts; these are known as boot files. Defragmenting organizes them into a cluster and makes it easier for the computer to find and access them. The faster the processor finds the boot files, the faster the start-up.

2. Fewer “DLL, SYS and EXE” errors – The most common errors associated with these file types occur when the computer cannot find them, often because the files are buried in inappropriate folders or duplicated in several locations. A good example is .exe files: sometimes applications and programs take too long to open, or do not open at all, because the .exe file cannot be located. Defragmenting sorts out the files on the computer and allows them to be accessed faster.

3. Discover problem areas on the hard drive – After defragmentation, the system provides a report of the changes made during the process, including any areas it could not defragment because of corrupted files. These broken files take up space on the drive and can drag down performance just by being there. With this information, the computer’s owner can look at the program files in that specific area and get rid of the problematic ones.

4. Less Effort on the Hardware – With files that are easier to locate, the internal mechanism of the hard drive does not have to work as hard to reach and access the data it needs. That means less wear and tear, since less effort and fewer resources are spent completing each action, which adds time to the lifespan of the hard drive and, in turn, the whole computer.

5. Tighter Security – With defragmented files, anti-virus programs become more efficient as well. They take less time to scan areas of interest on the hard drive, which improves the chance of isolating and deleting viruses before the integrity of other necessary files and data is compromised. Suspicious files also become easier to spot after defragmentation, because anything that does not fit the newly organized layout tends to stand out.
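
To make the “less effort on the hardware” point concrete, here is a deliberately simplified toy model (the block numbers are invented, and real file systems allocate space far more cleverly): reading a contiguous file requires no head jumps, while a fragmented one forces a jump for nearly every block.

```python
# Toy model of why fragmentation hurts: count the head "jumps" needed to read
# a file laid out contiguously versus one scattered across the disk.
# Illustration only; this is not how a real file system allocates blocks.

def seeks_needed(block_positions):
    """Count jumps between non-adjacent blocks while reading them in order."""
    jumps = 0
    for prev, curr in zip(block_positions, block_positions[1:]):
        if curr != prev + 1:
            jumps += 1
    return jumps

contiguous = list(range(100, 120))            # 20 blocks in a row
fragmented = [100, 341, 102, 977, 215, 104]   # blocks scattered around the disk

print("Contiguous file seeks:", seeks_needed(contiguous))   # prints 0
print("Fragmented file seeks:", seeks_needed(fragmented))   # prints 5
```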

These benefits make disk defragmentation a necessary step in making sure any desktop computer lasts a considerable amount of time. What makes defragmenting even more appealing is that it is simply another command given to the computer: Windows lets users start a defragmentation from the built-in Disk Defragmenter, traditionally found under System Tools in the Accessories menu.

A simple click starts the process. Depending on how much data is on the hard drive, it can take a few hours to complete, which is why defragmenting is usually done during off-peak hours when the computer is not being used.
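
If you prefer to script the process rather than click through menus, Windows also ships a defrag command-line tool, which makes it easy to schedule runs during those off-peak hours. The sketch below (Python calling that tool) is one possible approach rather than the only way to do it; the /A flag shown requests an analysis report only, and available flags vary by Windows version, so verify them on your own system.

```python
# Sketch: invoking Windows' built-in defrag tool from a script, for example one
# launched by Task Scheduler overnight. Assumes the defrag.exe shipped with
# Windows Vista and later; check the flags on your version before relying on it.
import subprocess

DRIVE = "C:"

# "/A" asks for an analysis report only; drop it to actually defragment.
result = subprocess.run(["defrag", DRIVE, "/A"], capture_output=True, text=True)

print(result.stdout)
if result.returncode != 0:
    print("defrag reported a problem:", result.stderr)
```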

Along with registry cleaning and anti-virus scans, disk defragmenting is another tool owners can use to take care of their machines. Because these methods are free and easy to use, owners have no excuse not to maintain their desktop computers properly.

If you are looking for a DLL tool to restore missing or corrupted files, you can download one for free at http://www.dlltool.com/

Article Source: http://EzineArticles.com/?expert=Pete_F_Morgan

Photo courtesy of Noelsch
PROTECT YOURSELF IF A CATASTROPHIC EVENT OCCURS
Oct 24th, 2014 by aperio

Hurricane Sandy, the Black Forest fire, a 6.0 earthquake in Napa Valley: major catastrophes strike large population centers, and businesses are damaged or even destroyed. Yet even after these events, many of which make international news, numerous companies keep all of their corporate data in the same building, and in many cases the same room.

No matter what the business goal or high level requirements, organizations must take action, intelligent action, to protect critical data. While this may seem like common sense, it’s amazing how often companies fail to perform even the most basic protection.

Nearly every business has a policy in place to cover disaster recovery, a catch-all phrase for the need to restore data should trouble occur. In reality, disaster recovery is one piece of a larger picture that includes high availability and business continuity. All of these concepts revolve around two basic ideas: recovery point objective (RPO) and recovery time objective (RTO).

There’s a tradeoff between the potential for data loss, the time it takes to recover, and cost. Certain businesses require high availability: near zero data loss and near zero downtime. Examples include the financial industry, healthcare, and most organizations that process transactional data. In other words, any time there is a need to trace an action from start to finish, there must be a way to achieve near zero data loss and, more often than not, no downtime.
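
As a rough, back-of-the-envelope illustration of that tradeoff (the strategies and figures below are invented assumptions, not measurements): with a nightly backup the worst-case recovery point is roughly a day of lost work, while synchronous replication pushes both RPO and RTO toward zero at a much higher cost.

```python
# Back-of-the-envelope RPO/RTO comparison. The strategies and figures are
# illustrative assumptions only, not benchmarks.

strategies = {
    # name: (copy interval in hours, typical time to restore in hours)
    "nightly tape backup, stored offsite": (24.0, 48.0),
    "hourly snapshots to a second site":   (1.0,  4.0),
    "synchronous replication (HA)":        (0.0,  0.1),
}

for name, (interval_h, restore_h) in strategies.items():
    # Worst-case RPO: everything written since the last copy is lost.
    # RTO: roughly how long until systems are usable again.
    print(f"{name:37s} RPO <= {interval_h:5.1f} h   RTO ~ {restore_h:5.1f} h")
```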

Business continuity is a step down from high availability on both RPO and RTO. The idea here is not instantaneous recovery; it’s making sure the business can continue to function after catastrophe hits. VMware and similar technologies built on redundant infrastructure do a great job of providing business continuity; the key is how that environment is set up and over what distance, if any at all.

Disaster recovery covers both high availability and business continuity, but it can also be as simple as a copy of data sitting on tape or a storage area network. The key question is where that data resides. Keeping the copy in the same location as the source data offers no protection against most major catastrophes; that “old school” mindset really only protects a business from a power outage, data corruption, or system-related failures. Does your business implement this simplistic disaster recovery method?

Hurricane Sandy devastated the east coast in 2012, and a number of hospitals were directly impacted. One facility, a client at the time, shut its doors after the storm because of massive damage. I recall that its data center was in the basement and the water rose to the fifth floor; everything in the data center was destroyed. Without offsite data storage, not only would this hospital have been out of business, it would have had no way to run down its accounts receivable to obtain payment for services rendered.

While working with a global storage provider located within a couple of miles of the most devastating fire in Colorado history, I found out they had zero data protection outside of their server room. If the building had burned down, as so many others did during that catastrophe, the company would have gone out of business. Data is key; protecting it is fundamental.

The recent 6.0 earthquake in Napa Valley shows that it is not only private industry that needs to understand and implement realistic, attainable disaster recovery; government must do the same. When certain disasters strike, they can impact our infrastructure, including gas, electricity, and transportation. Computer systems run a large share of critical services, including traffic signals, lighting, and gas and electric power delivery to the populace. Without proper disaster recovery, with the necessary RPO and RTO in place, a community can suffer a major impact. Government cannot consider only physical infrastructure when preparing for disaster; it has to understand the information technology impact as well.

A major impetus for this article is the discrepancy between what a business believes it has in place and what truly exists. Many organizations, often at the direction of the board, create extensive disaster recovery plans. Unfortunately, significant variance often exists between what the business says it wants and what is actually in place. Third-party audits are critical to help close this gap, but before an audit can occur, leadership has to know about and acknowledge the gap. Education is key; know there’s a problem and act!

Article by: Eric Jefferey
Photo by: Sebastiaan ter Burg