The Yottabyte Blog

vSphere 6.0 is BADASS. Not that I’ve played with it or anything. Ahem

The future is VMware’s to lose

4 Feb 2015 at 14:58, Trevor Pott

VMware has NSX, and NSX is good; they are absolutely ready to handle your most complex networking needs…even if those needs include layer 2 extension into vCloud Air. And thanks to vCloud Air and NSX, VMware currently has the only big name hybrid cloud solution that doesn’t suck. Though they had better watch out, as eager competitors like Yottabyte are building strong challengers.

VMware has VSAN, and VSAN is also good; VMware can provide enterprise storage for your datacenter themselves, or through their army of partners attaching through NFS 4.1 or VVOLs, through kernel integration, or as virtual hyperconverged offerings. Though here again VMware needs to be cautious; competitors like Nutanix, SimpliVity, Maxta, Scale Computing, Yottabyte, NimBOXX, HP, Gridstore, Atlantis, and many (many!) others are seeking a slice of the pie.

Many of these hyperconverged competitors offer solutions that work not only on VMware’s platform, but also on KVM and Hyper-V, and some are even talking about Xen. Many of these competitors offer features VMware currently does not. Those working in the KVM space are a special concern, as many of them are integrating with the kernel, removing VMware’s loudest marketing option from play. They are also legitimising that platform by putting real money into making management interfaces that don’t suck, and bringing hyperconvergence to Openstack.

This then brings us to VMware’s collision with Openstack. Openstack is all about giving service providers the ability to build their own cloud. VMware is (mostly) offering this ability too. But VMware also has its own cloud interests to protect, and it would really like it if everyone would (pretty please) dump their VMs into VMware’s hands…along with all that nice subscription money.

How is this all going to play out? Nobody knows; that is as much a function of international politics as anything else. Powerful forces – most notably the Americans and the Brits – are hell bent on stripping us of every last vestige of privacy. Companies (and individuals) who think about such things for more than a few seconds tend to have some problems with that. Especially since – and let’s be honest here – nobody can trust the Americans not to engage in industrial espionage.

Microsoft is busy trying to strangle its own channel, jacking up prices for service providers and partners and making the only hybrid-cloud-in-a-can offering it has inflexible and insanely priced. HP is busy tearing itself apart, Oracle is crafted from the fundamental evil at the center of the universe, both Amazon and Google don’t care, and IBM couldn’t find the future with two hands, a Sherpa and a GPS.

That leaves the open source community via Openstack, VMware and Dell to provide cloud software and services to the 6.5bn+ people who aren’t American or British. Mirantis, Cisco (via Metacloud) and Piston Cloud – just for starters – will show you that Openstack is ready to meet this challenge. Dell’s plans are a complete mystery (and good on ‘em for that), which leaves us with VMware.

vSphere 6.0 is unrepentantly badass. vCloud Air is coming along nicely, and all of the other pieces of the puzzle are evolving steadily as well. This is great and wonderful, but the question that will hang over VMware for all of 2015 is how they will handle the hybrid cloud.

Will VMware go the Microsoft route and pay lip service to their channel and partners while silently working to kill them off? Or will VMware be both combatant and arms dealer, keeping prices low enough for service providers to build competitively priced VMware-based clouds?

To whom will “the other 6.5 billion” belong? With technology like VMware has demonstrated in vSphere 6.0, they are VMware’s customers to lose. ®

As a company with a focus on innovation and progress, we’re naturally always on the look-out for the next big thing. Not just the next big thing in software and technology, but the people who have the potential to revolutionize the worlds of software and technology.

With our commitment to finding and encouraging up-and-coming talent, we launched the Yottabyte Innovations in Software Scholarship. The scholarship was open to graduate and undergraduate students with a GPA of 3.0 or higher studying software programming, engineering, or technology at an accredited university in the United States. Applicants were asked to tell us in 1,500 words or less what they think the next, most impactful innovations in software will be. To choose the winners, we took both their essays and their academic records into consideration. The grand prize winner receives $1,000 and the second prize winner receives $500.

We are proud to announce the first grand prize winner of the Yottabyte Innovations in Software Scholarship is Cory Alford. Cory received his Bachelor’s degree in Engineering Physics with a minor in Mathematics from the University of Illinois at Urbana-Champaign in 2013 and is now working towards his Master’s degree in Computer Science through the University of West Georgia’s online Master’s program. He currently lives in Los Angeles and works as a software engineer on defense research projects for the U.S. Government. Cory plans to pursue a Ph.D. in Computer Science and would love to work for DARPA in their Information, Innovation, and Cyber department. We certainly admire Cory’s ambition and are very confident we will see great things from him.

As for what Cory envisions as the next, most impactful innovation in software, he sees eliminating the barriers caused by non-standardization and overhead in software development and distribution as the key to unlocking the door to real innovation. He believes the portability of software platforms can be improved by streamlining the software transportation network, combining the processes of building, shipping, running, and administering software distributions.


The term “software-defined” has been making its way around the technology world for a few years now. We have software-defined datacenters, software-defined storage, software-defined networking, software-defined computing, and now there’s software-defined everything. But many people still seem unsure what “software-defined” really means, which is why so many are quick to dismiss the whole concept as nothing more than a buzzword, catch phrase, or marketing gimmick.

Since “software-defined” is a somewhat abstract concept, many people have difficulty pinning down something so intangible. Put simply, a software-defined datacenter is a datacenter and infrastructure in a bubble. With software-defined datacenters, the entire infrastructure is virtualized and delivered as a service, and the entire datacenter can be controlled through software.

Software-defined datacenters are defined by two main characteristics: abstraction, and pooled resources that can be very quickly reconfigured. If resources need to be re-provisioned, this can be done in moments through software controls instead of by completely reconfiguring hardware or your infrastructure.
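To make that idea concrete, here is a minimal sketch of what “re-provisioning through software controls” can look like. The endpoint, pool names, and fields are all hypothetical, invented for illustration rather than taken from any particular vendor’s product:

```python
import json
import urllib.request

# Hypothetical SDDC management API -- illustrative only.
API = "https://sddc.example.com/api/v1"

def resize_pool(pool: str, vcpus: int, ram_gb: int, storage_tb: int) -> None:
    """Re-provision a resource pool with one API call --
    no cabling, no rack visits, no hardware changes."""
    body = json.dumps({"vcpus": vcpus, "ram_gb": ram_gb,
                       "storage_tb": storage_tb}).encode()
    req = urllib.request.Request(f"{API}/pools/{pool}", data=body,
                                 method="PATCH",
                                 headers={"Content-Type": "application/json"})
    urllib.request.urlopen(req)

# Shift capacity between workloads in seconds, entirely through software.
resize_pool("production", vcpus=128, ram_gb=512, storage_tb=40)
resize_pool("test", vcpus=16, ram_gb=64, storage_tb=5)
```

The point isn’t the specific API; it’s that the whole operation is a small, scriptable request instead of a hardware project.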

Another key feature of “software-defined” is improved efficiency. When asked to explain the concept of “software-defined,” Brian Kirsch of the Milwaukee Area Technical College said, “Datacenters are no longer rooms that organizations show off to potential clients. Instead, they’re becoming a collection of servers running hypervisors. The real highlights that companies are showcasing are the agility, scalability, and redundancy that software-defined products give them.” The era of a huge datacenter being seen as impressive is over. Now large datacenters are often seen as inefficient and more of a hindrance than anything else. Software-defined datacenters aren’t about needing more to do more or having to make do with less. They’re about taking the resources you already have and getting the absolute most out of them without having to pay more to do so.


Tape has spent several decades as the medium of choice for data storage. It reigned supreme throughout the 1980s and 1990s, but with cloud storage growing in popularity and the price of flash storage declining, many people have been eager to declare tape a dead storage medium. Although tape as a storage medium is on the decline, it isn’t dead yet; it will likely be hanging around for quite some time.

Regardless of whether you’re using tape or a cloud storage system, having your data backed up off-site is always a wise idea. Off-site backup creates an extra layer of security for your data: if your backups are stored on-site, there’s always the possibility that they’ll be lost or damaged along with everything else if something catastrophic, like a fire or natural disaster, happens to your main facility. But if you’re going to invest in backing your data up off-site, cloud storage has some significant advantages over tape.


Predictability

When businesses put their data into storage, they need that data to still be readable anywhere from 15 to 30 years later. When you use tape to back up your data, you’re storing your data on a relatively unpredictable medium. Because tape is a physical medium, it is susceptible to wear and tear. Tape also needs to be stored properly if it’s going to remain in good, readable condition. If tape isn’t stored properly, it can deteriorate or be damaged, potentially making data irretrievable when you need it.


Convenience

Your data belongs to you; you shouldn’t have to wait to access it. Have you ever logged into your bank account to look up a bank statement from several months ago, only to be told you’ll have to make a special request for that statement and it’s going to take a while before you’ll be able to have it? This is what happens when banks use tape to back up their data. Once the data is moved to tape, someone has to go back and look it up when it’s needed. With cloud storage, your data is there for you exactly when you need it.


Lower Costs

Since you won’t need a person to look up older information that has been archived to tape, cloud storage inherently offers some cost savings over tape. And since cloud storage solutions are becoming more and more affordable, the savings also make it easier for companies to create the redundancy that is always desirable with data storage. Although cloud storage isn’t susceptible to physical damage the way tape is, that doesn’t mean it is completely fail-safe. If there are issues with one cloud, you can afford to have another cloud to fall back on until the main cloud is back online.
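That fallback idea is simple enough to show in a few lines. This is a toy sketch with made-up provider URLs, just to illustrate the shape of the logic:

```python
import urllib.request

# Made-up endpoints for a primary and a secondary cloud provider.
CLOUDS = [
    "https://primary-cloud.example.com/backups",
    "https://secondary-cloud.example.com/backups",
]

def fetch_backup(object_name: str) -> bytes:
    """Try the primary cloud first; fall back to the secondary if it fails."""
    last_error = None
    for base_url in CLOUDS:
        try:
            with urllib.request.urlopen(f"{base_url}/{object_name}",
                                        timeout=10) as resp:
                return resp.read()
        except OSError as err:   # network or HTTP failure: try the next cloud
            last_error = err
    raise RuntimeError("all clouds unavailable") from last_error
```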


2014 was the first year in which the majority (51%) of workloads were performed in a cloud instead of a traditional IT space. Within the next five years, the amount of work performed in a public cloud is expected to grow by 44%. Clearly, cloud computing has taken the world by storm and it is here to stay. It’s not hard to see why it’s become so popular when you consider all the benefits it has to offer businesses:

Cost Efficiency

There’s a common misconception that cloud computing is prohibitively expensive. This isn’t true at all; 82% of companies report saving money after switching to the cloud and 14% said they were able to downsize their IT department or repurpose efforts to more strategic, business-impacting initiatives. Companies that make the switch to the cloud also save an average of 25% on personnel costs and 40% on consultations.

Scalability

Before cloud computing, provisioning resources was always a challenge for IT professionals. If you under-provision, applications can’t run optimally. If you over-provision, you’re wasting money on resources that aren’t actually being used. Cloud computing takes the work out of provisioning, and you only pay for the resources you use. If you suddenly find yourself in need of more resources, it’s easy to add more. If those extra resources are no longer needed, it’s just as easy to scale back down again.
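The elasticity is easy to picture as code. Here’s a minimal sketch of the scale-up/scale-down decision, assuming a hypothetical client object with add_instances and remove_instances methods (not any specific provider’s SDK):

```python
def rescale(cloud, current_instances: int, cpu_utilization: float) -> int:
    """Add capacity when busy, shed it when idle -- you only pay
    for the instances that are actually running."""
    if cpu_utilization > 0.80:                  # under-provisioned: apps degrade
        cloud.add_instances(count=2)
        return current_instances + 2
    if cpu_utilization < 0.20 and current_instances > 1:
        cloud.remove_instances(count=1)         # over-provisioned: wasted spend
        return current_instances - 1
    return current_instances                    # provisioning is about right
```

In practice a provider’s auto-scaling service runs a loop like this for you; the thresholds here are arbitrary examples.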

Greater Flexibility for Employees

Cloud computing offers great potential for giving your employees the flexibility to work remotely. Studies have shown that many working adults would be willing to take a pay cut in exchange for being able to telecommute for work. If you have employees working remotely, that means you can have smaller office environments, saving you money on rent.

Sometimes companies want to hire freelancers, but don’t have room in their office for extra people. Cloud computing provides the perfect solution to this problem since a company could hire freelancers and have them work remotely. Companies get the benefit of the extra help when they need it, but don’t have to find extra space in the office or pay for additional desks or computers.

Even if you don’t want your employees working remotely on a full-time basis, it can still be very beneficial in certain situations. Sometimes an employee can’t come into the office for reasons other than being sick, such as inclement weather, having to wait for a repairman, or car problems. An employee could work from home in cases like those, preventing total losses of productivity.

Easier Collaboration

If your employees frequently collaborate on files, cloud computing is exactly what you need. When a person works on a file, their work is automatically saved on the cloud and other employees who need to work with the same file can instantly access the updated file. There’s no longer a need for employees to be constantly e-mailing each other updated versions and nobody is left wondering whether or not they have the most recent version of a file.

The ease of collaboration is a huge benefit if you’re a business that has team members working in different states, regions or countries. Even if people are working on opposite sides of the globe, they can access files from one central location. A 2010 study by Frost and Sullivan found that companies that invested in collaboration technology saw a 400% return on investment.


Efficiency. Flexibility. Simplicity. Affordability. These four qualities ought to be industry standards, but you’d be amazed how rarely they’re found together. With software-defined datacenters (SDDCs), these are exactly the types of benefits you can get. Some people out there remain convinced that “software-defined” anything is nothing more than a buzzword or a fad because certain solutions are built on inferior and incomplete SDDC implementations. But we certainly believe the benefits of SDDCs are not only available for use today, but represent the future of IT. We’ve built a better SDDC — yCenter.

Cost Efficiency

We believe in making cutting-edge technology attainable because there’s no point in creating something great if nobody can afford to use it. With yCenter, you get free access to things that other companies charge an arm and a leg for, like data deduplication. The KVM hypervisor is a standard part of yCenter, included at no additional cost. And not only do you get all these features for no extra charge, Yottabyte yCenter is the only hyper-converged infrastructure software solution on the market to include networking capabilities, making it a truly complete SDDC solution.

Improved Performance

SDDCs are all about taking what you already have and getting more out of it. You won’t pay more, the size of your datacenter won’t get any bigger, and you won’t see your energy bills skyrocket; the only thing that will grow is your datacenter’s performance. Yottabyte yCenter can give you five times the performance, for one-fifth the cost, and in a fraction of the physical space of a traditional datacenter.

Easy Recovery

Nobody likes to think about things going wrong with their datacenters. But things do go wrong sometimes, and you don’t want to be caught unprepared when they do. yCenter is an incredibly resilient SDDC and makes getting back on track a not-so-big ordeal. A datacenter with zero downtime is a very real possibility! The low-cost, modular nature of yCenter makes it easy to have redundant systems with synchronized data, eliminating the possibility of a single point of failure. yCenter also offers continuous data protection with our snap-sync technology, making it a breeze to restore data to an earlier point in time.
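To illustrate the general idea behind point-in-time restore, here’s a generic snapshot sketch in Python. To be clear, this is a teaching toy, not yCenter’s actual snap-sync implementation:

```python
import copy
import time

class SnapshotStore:
    """A generic illustration of point-in-time snapshots and restore."""

    def __init__(self):
        self.data = {}
        self.snapshots = []            # chronological (timestamp, copy) pairs

    def snapshot(self):
        self.snapshots.append((time.time(), copy.deepcopy(self.data)))

    def restore(self, point_in_time):
        """Roll back to the most recent snapshot at or before the given time."""
        eligible = [s for t, s in self.snapshots if t <= point_in_time]
        if not eligible:
            raise ValueError("no snapshot at or before that time")
        self.data = copy.deepcopy(eligible[-1])

store = SnapshotStore()
store.data["ledger"] = "v1"
store.snapshot()
moment = time.time()
store.data["ledger"] = "corrupted"     # something goes wrong...
store.restore(moment)                  # ...and the data is back to "v1"
```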

Flexibility & Simplicity

SDDCs make IT management a much simpler and more flexible process than it has ever been before. As an example, if you need to test an application or patch, you can easily clone your entire datacenter (in under a second) and try the update out without impacting any of your production systems. You’ll have a faster turnaround on requests, and since your IT staff won’t have to spend as much time on maintenance, they’ll be able to do more work in other, more business-impacting capacities.


Cloud computing and storage used to be broken down into two different types: private clouds and public clouds. With public clouds, storage space and computing resources are made available by a service provider, and a customer buys just part of the service provider’s resources. Private clouds are created exclusively for the use of one company or organization.

Public and private clouds each have their own benefits. Public clouds tend to be more affordable and are easily scalable to a business’ needs, but private clouds (or virtual private clouds) offer the security of having data stored in an infrastructure created just for you, behind your firewall. More and more businesses are seeing the advantages each of them offers and want to take advantage of both. This is where the hybrid cloud comes into play.

What Is the Hybrid Cloud?

A hybrid cloud is a type of cloud architecture in which a private cloud (behind your firewall) and a public cloud (a service provider) are both used and connected through a secure connection. The clouds exist as two separate entities, but the connection between them creates a seamless work environment in which each can leverage the other.

Who Can Benefit From the Hybrid Cloud?

Hybrid clouds are an option for businesses where the need for high-performance computing and data storage is big, but the budget and physical space available are small. Some businesses like to keep currently used files on a private cloud but use public clouds for archive storage. In other cases, some or all of a company’s servers may run in the public (virtual private) cloud. For example, hybrid clouds have become popular in the healthcare industry: due to HIPAA regulations, some information needs to be stored in a private cloud, but other, less sensitive applications can be sourced via a public cloud. They’ve also become particularly popular with financial institutions, law firms, retail establishments, and the travel industry.

The Future of the Hybrid Cloud

IT research company Gartner estimates that by the end of 2017, nearly half of all large enterprises will have deployed hybrid clouds. There’s simply no need to settle for one or the other if you can easily benefit from both. There’s also no need to have separate hybrid cloud and datacenter providers. Yottabyte is the first hybrid cloud provider to also offer a virtual datacenter, making it a truly complete solution for your data storage, computing and networking needs.


BLOOMFIELD TOWNSHIP, Mich., Sept. 18, 2014 /PRNewswire/ – Yottabyte, a rising star in software-defined data storage and datacenters and creator of yCenter and yStor, today announced that Chemical Bank, a growing community bank headquartered in Michigan, has deployed the yStor software-defined storage solution to help maintain its growing IT infrastructure. yStor’s advanced data management features are driving new levels of performance, flexibility and reliability for the bank, contributing to a more flexible infrastructure to support its growth.


Located in Midland, Michigan, Chemical Bank is the second largest bank headquartered in Michigan, with a long history of serving individuals, families and businesses across the state. Chemical Bank continues to expand its footprint in Michigan, most recently with its announced partnership with Northwestern Bankcorp, which will add 25 new locations across 11 new Michigan counties once the planned merger closes.

Chemical Bank’s aggressive growth strategies had brought its data retention and backup needs to a head. The existing tape backup system was maxed out and in need of an expensive upgrade, and because of regulatory requirements, multiple copies of all data needed to be retained, compounding the data explosion. Chemical Bank needed a flexible, scalable, reliable enterprise storage system to manage its massive data growth and enhance its disaster recovery capabilities.

“Our legacy solution was contingent on hardware to make it work. Scaling the legacy solution would require we add more hardware and more fixed cost. Yottabyte solved our business problem by archiving our data via their software onto commodity hardware at a cheaper variable cost point,” said Tad Sumner, Vice President, Information Technology, Chemical Bank.

Chemical Bank chose yStor, Yottabyte’s software-defined storage solution, over other offerings to create an innovative storage infrastructure that it can count on to grow as the bank grows. It chose yStor for its combination of technical functionality and economical pricing. In addition, the yStor system’s highly reliable architecture and easy-to-use dashboard simplify deployment and ongoing SAN administration.

“In our analysis of backup solutions, Yottabyte distinguished itself with its software approach to backup and archiving,” said Sumner. “While other suppliers were more about the sale, Yottabyte spent a great deal of time getting to know our business and our storage needs. They were then able to help us craft a solution with their software that made the most sense and was scalable for future growth.”

About Yottabyte
Who are we? We’re just like you – frustrated by the limitations of modern IT infrastructure, overwhelmed by the complexity and rigidity of legacy technology solutions, and left wanting more and better options. We’re a team with decades of experience running tech companies, and we happened to be in a position to do something about it. We’re headquartered in Michigan, which is the place to be from if you want to change the rules of the old, entrenched IT game. We provide affordable datacenter capabilities in a platform that can be out-tasked entirely or managed by an IT guy, not an engineer or scientist.

More information is available at www.yottabyte.com.

For more information or to schedule an interview, please contact:

Duane Tursi
Yottabyte, LLC
1-888-630-BYTE

SOURCE Yottabyte LLC


Imagine your company’s financial health revolved around you keeping a large jar on the CEO’s desk filled with jelly beans. No doubt your first move would be to keep an adequate supply of jelly beans somewhere on the premises.

Sounds pretty simple so far, doesn’t it? Now imagine that some weeks the CEO really digs into the jelly beans, while other times he or she barely touches them at all. There’s no pattern to consumption, so you have to plan to have enough around for the peak times, even if you don’t need them for weeks.

Finally, imagine that the number of jars you have to keep filled expands each week from one to two to four and so on, and that you need to double the number of jelly beans on hand just in case the first batch gets ruined. Pretty soon you’ll have a lot of money invested not only in the jelly beans themselves but the floor space to store them and the equipment to move them from wherever they’re being stored to the CEO’s office.

That, in a nutshell, is the situation small and medium enterprises (SMEs) find themselves in today with data. What started out as a manageable task in the days when the sum of an enterprise’s data could be measured in gigabytes has become an expensive, unwieldy mess now that petabytes have become typical. And it’s an issue that shows no signs of slowing.

In 2009, analyst group IDC estimated the total amount of data in the world was nearly 800,000 petabytes. (Just for perspective, a petabyte is one million gigabytes; if you had a petabyte of DVDs the stack would reach to the moon and back.) In 2010, that number increased to 1.2 million petabytes. By 2020, IDC predicts the total data will be 35 zettabytes, which is 44 times the level it was in 2009; that stack of DVDs would now reach halfway to Mars.

With such exponential growth, SMEs are finding that the old model of building Storage Area Networks (SANs) and keeping all their data on servers inside the enterprise – plus a full backup, plus a mirror of it for disaster recovery protection – is no longer practical or affordable. Server virtualization offers temporary relief, but it merely treats the symptom. At the rate of data expansion, it won’t be long before even a virtualized internal data center is overwhelmed and cost-prohibitive.

The simple truth is there’s no reason to deal with all the complexity and cost that comes with storing data on internal, hardware-based systems that are expensive to build, difficult to manage and ultimately prone to failure despite your best efforts. After all, data is simply the raw material for most enterprises in the way polypropylene is a raw material for manufacturers. The value in the data comes from the applications that turn it into information. Those applications are where the organization should be placing its technology focus.

That’s why more enterprises are now looking toward a software-based solution using the “cloud” for their data storage requirements. Rather than building complex SANs in massive, power-hungry facilities to house the data, and then hiring costly experts to manage the entire operation, they’re making storage the responsibility of a third-party specialist and simply tapping into the data as needed.

One of the biggest reasons is flexibility in terms of cost per byte. In many industries, data needs ebb and flow based on seasonality, the economy or other factors. Think of our jelly bean example. Enterprises cannot afford to have less capacity than required, so they gear up for peak levels; when data needs ebb, however, they’re still paying for peak capacity even though they’re not using it.

With cloud-based storage, enterprises only pay for what they use. This pay-as-you-go model is in line with how they consume other commodities, such as raw materials, heating/cooling, electricity and so on. In addition, the cost of managing the data, backing it up, having a cold/warm/hot mirror site, etc. is built into the rate structure, further improving the value of going to the cloud instead of trying to build everything yourself.

Another reason to move to the cloud is to provide your employees with easier access to information. With cloud storage, data can be accessed anywhere, anytime, on any device by anyone with the proper authorization. All they need is a connection and they’re ready to work – no need to build SANs for each location or copy files and carry them. Given some of the recent news stories about millions of confidential customer records being compromised because of a stolen laptop, that alone is a good reason to consider it.

If the cloud sounds like a good choice, one thing that’s important to understand is that there are actually many cloud models – at least five or six, depending on who is describing them. The most well-known, of course, is the Internet, also known as the public cloud.

The Internet cloud is great for handling your personal email or calendar, downloading movies, doing your taxes and other small tasks. But for storing and moving data around an enterprise it’s not the best choice. No one is managing the Internet end-to-end, so if there’s a breakdown it could take hours before it’s rectified. Ask your CFO how much being without access to your data for a few hours would cost. Security is also an issue. The cost of data stolen as it passes through the public Internet – both hard costs and those to your enterprise’s reputation – is nearly incalculable.

A better option for businesses is a hybrid cloud that offers the scalability and access of the public cloud with the speed and security of an internal SAN. This hybrid cloud can be delivered via infrastructure as a service, a cloud storage appliance or even a virtualization machine appliance.

With these options, the software that drives them is designed to optimize the delivery of data across a distributed enterprise while using low-cost hardware. This model allows you to operate out of four Tier-1 data centers rather than having to rely on a single, expensive Tier-4 data center with all of its inherent loss-of-service risk. Having a multiple-site configuration across different geographies delivers redundancy and failover capabilities that would normally be beyond the budget of an SME.

The hybrid cloud also improves security by taking data off the public grid. Options also exist to generate and keep private your own encryption key, although you may want to think twice about that: if you lose that key, it’s gone – along with your access to the data.
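As a concrete illustration of that trade-off, here’s a minimal client-side encryption sketch using the Fernet recipe from Python’s cryptography package. Encrypt before upload, and note that a lost key really does mean lost data:

```python
from cryptography.fernet import Fernet, InvalidToken

# Generate and hold your own key -- the provider never sees it.
key = Fernet.generate_key()

ciphertext = Fernet(key).encrypt(b"confidential customer records")
# ...only `ciphertext` is uploaded to cloud storage.

# Later, with the key, the data comes back intact:
plaintext = Fernet(key).decrypt(ciphertext)

# With the wrong key -- or a lost one -- it's gone for good:
try:
    Fernet(Fernet.generate_key()).decrypt(ciphertext)
except InvalidToken:
    print("wrong key: the data is unrecoverable")
```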

Another advantage is that automatic hardware refreshes and software maintenance are built into the cloud model. Your cloud storage provider takes care of all of that, freeing your IT department to perform work that adds more value to the enterprise.

If you are considering moving storage to the cloud, be sure to thoroughly check out the potential provider, the hardware they use and the options they offer. There is no one cloud service that fits all, so it’s important to select one that will work with the way your enterprise works. It’s also important to make sure you have the network infrastructure to handle the increased load. If it’s been a couple of years since your last technology refresh, you may want to check your current capacity against the expected flow of data to be sure you’re ready for the move.

It’s highly unlikely the success of your organization depends on the flow of jelly beans to the CEO – although stranger things have happened. But it does depend on the smooth flow of data to the users who need it.

Moving your data from internal storage to the cloud, particularly a hybrid cloud, can help you store and deliver that data far more cost-effectively while focusing your IT resources on the activities that create value instead. Which is the point of having data in the first place.
