Google makes it easier to move data to its cloud
Google Cloud launches new Transfer Service to move on-premise data to the cloud
In an effort to help enterprises move data from their on-premises systems to the cloud, Google Cloud has announced Transfer Service.
This new managed service is designed to handle large-scale transfers of billions of files and petabytes of data in the easiest way possible.
Google has launched similar services in the past, including Transfer Appliance, which allows companies to ship data to its data centers via FedEx, and the BigQuery Data Transfer Service, which automates data transfers from SaaS applications.
Google Cloud’s new Transfer Service will handle the heavy lifting, and it can even validate the integrity of an organization’s data as it moves to the cloud. The agent also handles failures automatically and uses as much available bandwidth as possible to help reduce transfer times.
Transfer Service for on-premise data
To get started with Google Cloud’s Transfer Service, you just have to install an agent on your on-premises servers and select which directories you want to copy; the service will take care of the rest. You can also use the Google Cloud console to monitor and manage your transfer jobs.
While archiving and disaster recovery are obvious use cases for this new service, Google is also targeting businesses that want to move their workloads and their attached data to the cloud.
In a blog post announcing the new service, Scott Sinclair, senior analyst at ESG, explained why adopting Transfer Service is much easier than developing a custom solution, saying:
“I see enterprises default to making their own custom solutions, which is a slippery slope as they can’t anticipate the costs and long-term resourcing. With Transfer Service for on-premises data (beta), enterprises can optimize for TCO and reduce the friction that often comes with data transfers. This solution is a great fit for enterprises moving data for business-critical use cases like archive and disaster recovery, lift and shift, and analytics and machine learning."
Via TechCrunch
After working with the TechRadar Pro team for the last several years, Anthony is now the security and networking editor at Tom’s Guide where he covers everything from data breaches and ransomware gangs to the best way to cover your whole home or business with Wi-Fi. When not writing, you can find him tinkering with PCs and game consoles, managing cables and upgrading his smart home.