AWS S3 Cloud Storage Crashed Because of Human Error
The cloud giant admitted that human error was to blame for the 4.5-hour AWS outage that impacted millions of customers last Tuesday. Issues started at 11.35am PST on February 28th and lasted until 2.08pm PST, impacting all AWS services running out of its Northern Virginia, US, data centre.
In their statement, AWS said:
“We continue to experience high error rates with S3 in US-EAST-1, which is impacting various AWS services.”
Among the impacted services were Storage Gateway (the on-premises-to-cloud connection mechanism), the relational database service Amazon RDS, Data Pipeline, Elastic MapReduce and many others. In an explanatory statement, AWS said the issues were caused by human error: essentially, an engineer entered an incorrect command input while trying to take a small number of servers offline to make some fixes.
“One of the inputs to the command was entered incorrectly and a larger set of servers was removed than intended. The servers that were inadvertently removed supported two other S3 subsystems.”
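The failure mode described above, a single mistyped command argument removing far more capacity than intended, is exactly the kind of mistake that basic guard rails in operational tooling are meant to catch. The sketch below is purely illustrative and is not AWS's actual tooling; the capacity threshold, fleet size and function names are invented for the example.

```python
# Hypothetical guard rail for a capacity-removal tool (not AWS's tooling).
# The idea: refuse any request that would drop the fleet below a safety floor,
# so a fat-fingered argument fails loudly instead of draining the service.

MIN_CAPACITY_FRACTION = 0.85  # never take more than 15% of hosts offline at once


def remove_capacity(hosts_in_service: int, hosts_to_remove: int) -> int:
    """Validate a removal request and return the number of hosts to take offline."""
    if hosts_to_remove <= 0:
        raise ValueError("hosts_to_remove must be positive")

    remaining = hosts_in_service - hosts_to_remove
    safety_floor = int(hosts_in_service * MIN_CAPACITY_FRACTION)

    if remaining < safety_floor:
        raise RuntimeError(
            f"Refusing removal: taking {hosts_to_remove} hosts offline would leave "
            f"{remaining}, below the safety floor of {safety_floor}"
        )
    return hosts_to_remove


# Example: a typo turns an intended "5" into "500" on a 600-host fleet.
try:
    remove_capacity(hosts_in_service=600, hosts_to_remove=500)
except RuntimeError as err:
    print(err)
```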
Amazon apologized to its users and said it is taking the necessary measures to avoid similar incidents in the future. Nevertheless, the crash triggered a wave of comments from industry experts and competitors. The outage demonstrated the massive footprint AWS has, but it also showed how much businesses need a hybrid component in their solutions. Hybrid remains the best pragmatic approach for businesses working in the cloud, protecting them from downtime, lost revenue and a number of other problems caused by outages like this one.
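One practical way to add that resilience is to keep a replicated copy of critical objects in a second region or on-premises store and fall back to it when the primary region fails. The snippet below is a minimal sketch using boto3; the bucket names and regions are hypothetical, and it assumes the objects are already replicated (for example via S3 cross-region replication).

```python
# Minimal regional-fallback sketch for S3 reads (bucket names are hypothetical).
# Assumes the same objects already exist in a secondary bucket, e.g. through
# S3 cross-region replication or an on-premises store with a compatible API.
import boto3
from botocore.exceptions import BotoCoreError, ClientError

PRIMARY = {"region": "us-east-1", "bucket": "assets-primary"}    # hypothetical
SECONDARY = {"region": "us-west-2", "bucket": "assets-replica"}  # hypothetical


def fetch_object(key: str) -> bytes:
    """Try the primary region first, then fall back to the replica."""
    for target in (PRIMARY, SECONDARY):
        client = boto3.client("s3", region_name=target["region"])
        try:
            response = client.get_object(Bucket=target["bucket"], Key=key)
            return response["Body"].read()
        except (BotoCoreError, ClientError):
            continue  # this copy is unreachable; try the next one
    raise RuntimeError(f"Object {key!r} unavailable in every configured location")
```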
Read More:
TechCrunch, Cloud Pro
Google and Coursera Announce Cloud Training Partnership
Tech giant Google and online course platform Coursera announced a partnership aimed at accelerating the acquisition of cloud knowledge.
The courses will be taught by Google's experts and are designed to help both individuals and businesses. Louise Byrne, head of cloud training delivery at Google, said attendees will gain all the required skills and learn from the very people who developed the platform. The training courses will address many cloud-related topics such as operations, data analysis, cloud fundamentals and machine learning. The first available course will be Big Data and Machine Learning, part of the Data Engineering on Google Cloud Specialization.
“It’s a major milestone in Coursera’s journey toward closing the global skills gap and empowering learners with career-relevant skills.”
Leah Belsky, VP of Global Business Development at Coursera.
Coursera's cloud courses will be adapted to all levels of knowledge, ranging from beginners who are just starting to understand the cloud up to advanced cloud engineers. The company stated that their courses will be especially useful for current and aspiring IT professionals and data engineers.
Read More:
Akamai Ion Introduces Machine Learning Optimizations and Mobile SDK
Akamai released a new version of their Akamai Ion product. The web performance solution aims to provide an improved mobile experience through the addition of their Mobile App Performance SDK and the use of machine learning.
While Akamai has a proven track record as a leader in the CDN market, their grasp on the mobile sector has been relatively thin. With the newest release of Ion, Akamai will look to improve their position by tackling the challenges of mobile content delivery. Their main weapons are the two new additions: Automated Performance Optimization and Cellular Optimization.
“We believe this release of Ion marks the beginning of a new kind of powerful performance optimization”
Ash Kulkarni, Senior VP and General Manager of the Akamai Web Performance Business Unit.
The newest version of Akamai Ion offers a mobile SDK that enables developers to pre-position content, keeping the app experience consistent under poor or uncertain network conditions and lost connections. The SDK also contains SureRoute for Cellular, a feature for reducing latency over the mobile last mile while providing custom metrics.
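As a rough illustration of the pre-positioning idea (this is a generic sketch, not the Akamai SDK's API; the cache location and resource list are invented), an app can download likely-needed content while connectivity is good and serve it from local storage when the connection drops.

```python
# Generic pre-positioning sketch (not the Akamai Mobile App Performance SDK).
# Downloads a list of likely-needed resources ahead of time and serves them
# from a local cache when the network is unavailable.
import hashlib
import pathlib
import requests

CACHE_DIR = pathlib.Path("prefetch-cache")  # hypothetical cache location
CACHE_DIR.mkdir(exist_ok=True)


def _cache_path(url: str) -> pathlib.Path:
    return CACHE_DIR / hashlib.sha256(url.encode()).hexdigest()


def prefetch(urls: list[str]) -> None:
    """Fetch resources while connectivity is good and store them locally."""
    for url in urls:
        try:
            response = requests.get(url, timeout=5)
            response.raise_for_status()
            _cache_path(url).write_bytes(response.content)
        except requests.RequestException:
            pass  # skip resources we cannot reach right now


def load(url: str) -> bytes:
    """Prefer the live network, but fall back to the pre-positioned copy offline."""
    try:
        response = requests.get(url, timeout=5)
        response.raise_for_status()
        return response.content
    except requests.RequestException:
        cached = _cache_path(url)
        if cached.exists():
            return cached.read_bytes()
        raise
```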
Read More:
Salesforce CEO Says Brexit Prevented Higher Revenue
Cloud computing giant Salesforce reported full-year revenue of $8.39 bn, with the last quarter reaching $2.29 bn at 27% YoY growth. CEO Marc Benioff expressed his delight with the results while putting the emphasis on Salesforce's consistent growth:
“In 2004, when we went public, we had $46 mil in quarterly revenue. And now in the fourth quarter alone, we delivered $2.3 bn in revenue and, for this fiscal year, we are guiding to more than $10 bn.“
Along with his excitement about the positive results, Benioff took aim at Brexit, claiming that the numbers would have been even higher if it were not for Brexit and the subsequent instability of the British pound. In their earnings forecast for the next quarter, Salesforce projected 22-23% YoY revenue growth, which translates to between $2.34 bn and $2.35 bn.
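As a quick sanity check of that guidance (the prior-year comparison quarter is not stated in the article, so it is inferred here from the numbers given), the projected dollar range and growth rates imply a comparable quarter of roughly $1.9 bn a year earlier.

```python
# Back-of-the-envelope check: what prior-year quarter do the stated growth
# rates and dollar guidance imply? (Inferred, not quoted from the article.)
guidance_low, guidance_high = 2.34, 2.35  # next-quarter revenue guidance, $ bn
growth_low, growth_high = 0.22, 0.23      # projected YoY growth, 22-23%

implied_base_low = guidance_low / (1 + growth_high)   # roughly 1.90 $ bn
implied_base_high = guidance_high / (1 + growth_low)  # roughly 1.93 $ bn

print(f"Implied prior-year quarter: ${implied_base_low:.2f} bn - ${implied_base_high:.2f} bn")
```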
Read More:
IBM Q To Enable Commercial Quantum Computing
IBM announced their plans to build the first universal quantum computers intended for business and scientific use, although official release dates have not yet been revealed. Even so, the ambition is worth noting, since quantum computing is expected to revolutionise the way large-scale computing is performed.
The product, called IBM Q, will likely deliver its quantum systems via IBM's cloud platform, Bluemix. The company stated that their initiative to provide commercial quantum computing will be the first of its kind in the industry.
“Classical computers are extraordinarily powerful and will continue to advance and underpin everything we do in business and society. But there are many problems that will never be penetrated by a classical computer. To create knowledge from much greater depths of complexity, we need a quantum computer”
Tom Rosamilia, senior VP of IBM Systems
Quantum computing is expected to make analytics and big data disciplines more valuable and actionable. However, the security community has serious concerns about the technology: a sufficiently powerful quantum computer running algorithms such as Shor's could factor the large integers that underpin today's public-key cryptography, rendering current cryptographic practices obsolete and forcing new security technologies to be developed to offer adequate levels of data protection.
Read More: