@VMware (NYSE:VMW): @MorganStanley Reiterates Overweight Rating Today, Has a Target of $123/Share September 28, 2017 - By Ellis Scott

Investor sentiment decreased to 1.21 in Q2 2017, down 0.33 from 1.54 in Q1 2017. Sentiment fell as 45 investors sold VMware, Inc. shares while 123 reduced holdings; 60 funds opened positions while 143 raised stakes. Institutions reported 86.40 million shares, 2.10% more than the 84.63 million shares reported in Q1 2017.

Pacad invested 0.15% of its portfolio in VMware, Inc. (NYSE:VMW). Commonwealth Equity Svcs Inc owns 2,702 shares. Aviva Public Limited, a United Kingdom-based fund, reported 42,095 shares. Commonwealth Bank Of Australia holds 26,868 shares, or 0.02% of its portfolio. Cibc Asset Management accumulated 16,856 shares. Baystate Wealth Management Llc has 0.02% of its portfolio invested in VMware, Inc. (NYSE:VMW) for 1,200 shares. Gmt Corporation holds 1.21M shares, or 2.28% of its portfolio. Royal National Bank & Trust Of Canada holds 0.04%, or 880,908 shares, in its portfolio. Hsbc Holdings Public Ltd Co holds 130,270 shares, or 0.02% of its portfolio. Cornerstone Cap Limited Liability Com has invested 0.02% of its portfolio in VMware, Inc. (NYSE:VMW). Aberdeen Asset Mngmt Public Ltd Co Uk has 0% invested in VMware, Inc. (NYSE:VMW) for 12,335 shares. Smithfield Trust stated it has 1,886 shares, or 0.02% of all its holdings. Financial Bank Of Montreal Can owns 160,434 shares. Comerica Inc owns 4,648 shares, or 0.06% of its US portfolio. Prudential Incorporated reported 47,648 shares.

Since May 5, 2017, there have been 0 insider purchases and 6 insider sales for $238.98 million in activity. Durban Egon sold $59.40 million worth of VMware, Inc. (NYSE:VMW) shares. SLP Denali Co-Invest GP L.L.C. sold 666,354 shares worth $59.40M on Friday, May 5. DELL MICHAEL S sold 666,354 shares worth $59.40M. POONEN SANJAY sold $322,735 worth of shares. Carli Maurizio sold 11,237 shares worth $1.06 million. Dell Technologies Inc sold 666,354 shares worth $59.40 million.

VMware (NYSE:VMW) Rating Reaffirmed
Morgan Stanley currently has a $123 target price on VMware (NYSE:VMW), which would suggest a potential upside of 13.52% from the company's close price. This was revealed in an analyst report on September 27.
VMware, Inc. (NYSE:VMW) Ratings Coverage
Among 46 analysts covering VMware Inc (NYSE:VMW), 21 have a Buy rating, 0 a Sell and 25 a Hold, so 46% are positive. VMware Inc has a $130 highest and a $50 lowest target. The $100 average target is 7.71% below the current $108.35 stock price. VMware Inc has had 134 analyst reports since July 22, 2015, according to SRatingsIntel. Oppenheimer maintained the shares of VMW with a "Buy" rating in a report on Thursday, August 24. BMO Capital Markets maintained the shares with a "Market Perform" rating in a report on Friday, August 25. On Tuesday, April 4, the stock was initiated by Berenberg with a "Hold". Monness Crespi & Hardt downgraded the stock to a "Neutral" rating in a Wednesday, October 21 report. The stock has a "Neutral" rating from Susquehanna as of Wednesday, October 21. The rating was maintained by Bernstein at "Market Perform" on Tuesday, July 26, and by Cowen & Co at "Hold" on Monday, August 14. Maxim Group gave the stock of VMware, Inc. (NYSE:VMW) a "Hold" rating on Monday, January 30. Mizuho maintained the shares with a "Neutral" rating in a report on Thursday, October 27. RBC Capital Markets maintained VMware, Inc. (NYSE:VMW) with an "Outperform" rating on Wednesday, August 23.

About 248,505 shares traded. VMware, Inc. (NYSE:VMW) has risen 57.35% since September 28, 2016 and is uptrending, outperforming the S&P 500 by 40.65%. VMware, Inc. is an information technology company with a market cap of $44.31 billion and a 34.14 P/E ratio. The firm is engaged in the development and application of virtualization technologies for x86 server computing, separating application software from the underlying hardware. It offers products that allow organizations to manage IT resources across private clouds and multi-cloud, multi-device environments by leveraging synergies across three product categories: Software-Defined Data Center (SDDC), Hybrid Cloud Computing and End-User Computing (EUC).

More notable recent VMware, Inc. (NYSE:VMW) news was published by Investorplace.com, which released "Why VMware, Inc. (VMW) Stock Is a High Growth, Low Risk Play" on September 26, 2017 and "A Breakout Looms for VMware, Inc. (VMW) Stock" on September 13, 2017, while Cnbc.com published "VMware executive: As the cloud grows, so does the 'new hardware economy'" on September 26, 2017. Further coverage came from Bloomberg.com, with "How VMware's Partnership With Amazon Could End Up Backfiring" published on September 1, 2017, and from Seekingalpha.com, with "VMware's Growing Partnerships" published on August 30, 2017.
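The upside and downside figures quoted in this piece are simple ratios of an analyst's price target to the last close. A quick sketch of the arithmetic, with the prices hardcoded from the article and a hypothetical helper function:

```python
def implied_move(target: float, close: float) -> float:
    """Percentage move implied by an analyst price target."""
    return (target / close - 1) * 100

close = 108.35  # VMW close price cited in the article
print(f"Morgan Stanley $123 target: {implied_move(123, close):+.2f}%")  # +13.52%
print(f"$100 average target:        {implied_move(100, close):+.2f}%")  # -7.71%
```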
TechNewSources is a one-stop shop for all the latest datacenter TechNews you can use.
Thursday, September 28, 2017
Dell EMC Channel Chief John Byrne Offers Carrot, Stick To Spur Services Growth
Partners have heeded the call to sell more services, but @DellEMC Channel Chief @JohnByrne isn't about to take the pressure off. He's reminding solution providers that they have to hit a minimum level of services sales to maintain program tier levels next year.

"We're basically doing the carrot and the stick," Byrne said in an interview with CRN. "We'll reward you for attaching our services, but we also said your tier threshold for next year is going to require hitting a minimum gate for services."

Partners in Dell EMC's top Titanium tier, for example, have a $6.5 million services quota for the current fiscal year. A top executive at one Titanium partner told CRN that he's already informed Dell EMC channel leadership that his company is not likely to hit that goal. "They're definitely making a push," the executive said, "although that's a pretty high bar."

The services requirement is on top of an already daunting climb for solution providers seeking to maintain status after being grandfathered into tiers – Gold, Platinum, and Titanium – that carry much heavier revenue requirements for legacy Dell partners than they did in the past.

So far, Byrne said, partners seem to have taken the message to heart. Dell Technologies reported $9.8 billion in services revenue in the first half. "We said start attaching our services. The good news is our services growth in the channel was close to double digits" through the first half of the company's fiscal year.

Still, establishing a services business can be expensive and challenging for solution providers, especially smaller VARs that may not have the budget power to do so. Selling services only works for solution providers that can achieve high rates of utilization among customers. Without that, labor and other costs shouldered by resellers can complicate the equation.

"We're a big fan, and we use them pretty consistently, just attach them to everything we do by default," said Michael Tanenhaus, president of Mavenspire, an Annapolis, Maryland, solution provider that works with Dell EMC. "It's a way to help people partner better," Tanenhaus said. "If everybody's on the same page with how to go to market together, it makes it easier to work together."

Tanenhaus said there's plenty of room for Dell EMC's services business to grow in the channel, too, noting that while Dell had a broad range of services offerings available to the channel before its $58 billion acquisition of EMC last year, EMC did relatively little services business. "It's a cultural difference, and we haven't gone through the full arc of people knowing about it or testing it out, so it's still growing, and it's still in its initial arc," Tanenhaus said. "Legacy EMC was little tiny SKUs that went out to the partners for services, and EMC was not gaining margin on those programs."
The Hot Stock: Move Over For Micron
@MicronTechnology (MU) surged to the top of the S&P 500 after posting blowout fiscal fourth-quarter results and an upbeat outlook for the current quarter.

[Pictured: Micron's "3D XPoint" solid-state memory technology for servers.]

Micron gained more than 8.5% today to close at just over $37 a share, making it the day's best-performing stock. The shares continued to edge higher after the closing bell. My colleague Tiernan Ray dove into the semiconductor company's performance earlier today, and interviewed CEO Sanjay Mehrotra. Analysts were encouraged by rising margins, as well as the company's outlook for a nearly 50% increase in capital spending by Micron to a planned $7.5 billion this year.

Barclays' Blayne Curtis called it "a beat and raise" quarter with healthy trends that will likely continue into next year: "FY18 capex guidance of $7.5B (35-45% allocated to each of DRAM/Flash) was on the high side, but MU plans to keep bit growth in line with the market at +50%/+20% in NAND/DRAM respectively, similar to what we heard on our Asia trip several weeks ago. The absolute amount of capex dollars is going up for all, but the company is progressing with their technology development and in general not trying to outgrow the market (likely remains undersupplied through next year, particularly in NAND). Even with the higher capex, MU plans to be net cash positive exiting FY18. Net net, the story continues to have additional legs as the market remains undersupplied into FY18, driving both revenue and margins."

And Credit Suisse's John Pitzer raised his price target on the stock from $40 a share to $50 a share: "Big CapEx A Big Positive: For too long MU has been content as a fast follower (300mm, Cu interconnect) – which has hurt MU's costs. Even with CapEx of $7.5bb, CFO endorsed possibility of net cash positive in FY18, implying cash flow from ops of ~$13.2bb versus our current estimate of $12.2bb."
http://www.barrons.com/articles/the-hot-stock-move-over-for-micron-1506547282
Open-source community pushing big data into AI realm
What's the surest way to advance a technology in a short time? Give it away — to an open-source community. Seminal big data software library @Apache @Hadoop gained momentum in #opensource, and today, most disruptive #bigdata development is springing from open source as well. "If people have the community traction, that is the new benchmark," said John Furrier (@furrier), co-host of theCUBE, SiliconANGLE Media's mobile livestreaming studio. This was evident at SiliconANGLE's and theCUBE's BigData NYC 2017 event, where Furrier and co-host James Kobielus (@jameskobielus) discussed the community edge.

Yahoo Inc. just open-sourced its big data search and recommendation software Vespa, following its hugely popular 2006 Hadoop contribution. It clearly believes Vespa can evolve via open-source developer brains just as Hadoop did. "As the community model grows up, you're starting to see a renaissance of real creative developers," Furrier said. These developers are not just working out implementation kinks; they're innovating at a level that makes a difference for applications. "Real creative competition — in a renaissance, that's really the key," Furrier stated.

The renaissance will be automated
Much new development branches out from big data per se into artificial intelligence, machine learning and the internet of things. "Data professionals and developers are moving toward new frameworks like TensorFlow," Kobielus said. TensorFlow is Google's open-source deep learning framework. Caffe and Theano are additional open-source deep learning technologies with bustling communities around them. Some of the most exciting work happening in open source (and at Stanford University) revolves around automating the acquisition of data needed to train machine learning models. Many would like to see deep learning tools and methods operationalized, enabling what some call DataOps or InsightOps (IBM's term), Kobielus pointed out. "I think what are coming into being are DevOps frameworks to span the entire life cycle of the creation and the training and deployment and iteration of AI," he said.
https://siliconangle.com/blog/2017/09/27/open-source-community-pushing-big-data-ai-realm-bigdatanyc/
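For readers who haven't touched the frameworks Kobielus mentions, a minimal TensorFlow sketch gives a feel for why developers are gravitating to it: a few lines define, train and evaluate a model. This uses the Keras API bundled with TensorFlow and synthetic data; it is purely illustrative, not code from the event.

```python
import numpy as np
import tensorflow as tf

# Synthetic data: 1,000 samples with 20 features, binary labels.
X = np.random.rand(1000, 20).astype("float32")
y = (X.sum(axis=1) > 10).astype("float32")

# A small feed-forward classifier built with the Keras API.
model = tf.keras.Sequential([
    tf.keras.layers.Dense(64, activation="relu", input_shape=(20,)),
    tf.keras.layers.Dense(1, activation="sigmoid"),
])
model.compile(optimizer="adam", loss="binary_crossentropy", metrics=["accuracy"])
model.fit(X, y, epochs=5, batch_size=32, verbose=0)

loss, accuracy = model.evaluate(X, y, verbose=0)
print(f"loss={loss:.3f} accuracy={accuracy:.3f}")
```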
Supermicro Introduces Portfolio of GPU-Optimized Systems for NVIDIA Tesla V100 GPUs
SAN JOSE, California, Sept. 27, 2017 — @SuperMicro Computer, Inc. (NASDAQ: SMCI), a leader in enterprise computing, storage, and networking solutions and green computing technology, today announced support for @NVIDIA #Tesla #V100 PCI-E and #V100SXM2 GPUs on its industry-leading portfolio of GPU server platforms.

For maximum acceleration of highly parallel applications like artificial intelligence ( #AI ), #deeplearning, #autonomousvehiclesystems, energy and engineering/science, Supermicro's new 4U system with next-generation NVIDIA NVLink is optimized for overall performance. The SuperServer 4028GR-TXRT supports eight NVIDIA Tesla V100 SXM2 GPU accelerators with maximum GPU-to-GPU bandwidth for important HPC clusters and hyper-scale workloads. Incorporating the latest NVIDIA NVLink GPU interconnect technology with over five times the bandwidth of PCI-E 3.0, this system features an independent GPU and CPU thermal zoning design, which ensures uncompromised performance and stability under the most demanding workloads.

Similarly, the performance-optimized 4U SuperServer 4028GR-TRT2 system can support up to 10 PCI-E Tesla V100 accelerators with Supermicro's innovative and GPU-optimized single-root-complex PCI-E design, which dramatically improves GPU peer-to-peer communication performance. For even greater density, the SuperServer 1028GQ-TRT supports up to four PCI-E Tesla V100 GPU accelerators in only 1U of rack space. Ideal for media, entertainment, medical imaging, and rendering applications, the powerful 7049GP-TRT workstation supports up to four NVIDIA Tesla V100 GPU accelerators.

"Supermicro designs the most application-optimized GPU systems and offers the widest selection of GPU-optimized servers and workstations in the industry," said Charles Liang, President and CEO of Supermicro. "Our high performance computing solutions enable deep learning, engineering and scientific fields to scale out their compute clusters to accelerate their most demanding workloads and achieve fastest time-to-results with maximum performance per watt, per square foot and per dollar. With our latest innovations incorporating the new NVIDIA V100 PCI-E and V100 SXM2 GPUs in performance-optimized 1U and 4U systems with next-generation NVLink, our customers can accelerate their applications and innovations to help solve the world's most complex and challenging problems."

"Supermicro's new high-density servers are optimized to fully leverage the new NVIDIA Tesla V100 data center GPUs to provide enterprise and HPC customers with an entirely new level of computing efficiency," said Ian Buck, vice president and general manager of the Accelerated Computing Group at NVIDIA. "The new SuperServers deliver dramatically higher throughput for compute-intensive data analytics, deep learning and scientific applications while minimizing power consumption."

With the convergence of big data analytics, the latest NVIDIA GPU architectures, and improved machine learning algorithms, deep learning applications require the processing power of multiple GPUs that must communicate efficiently and effectively to expand the GPU network. Supermicro's single-root GPU system allows multiple GPUs to communicate efficiently to minimize latency and maximize throughput, as measured by the NCCL P2PBandwidthTest.

About Super Micro Computer, Inc.
Supermicro (NASDAQ: SMCI), the leading innovator in high-performance, high-efficiency server technology, is a premier provider of advanced Server Building Block Solutions for Data Center, Cloud Computing, Enterprise IT, Hadoop/Big Data, HPC and Embedded Systems worldwide. Supermicro is committed to protecting the environment through its "We Keep IT Green" initiative and provides customers with the most energy-efficient, environmentally friendly solutions available on the market.
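The release benchmarks GPU peer-to-peer performance with the NCCL P2PBandwidthTest. As a rough illustration of what such a benchmark measures, here is a hedged PyTorch sketch that times a single device-to-device copy; it assumes a machine with at least two CUDA GPUs and is not the NVIDIA tool itself.

```python
import time
import torch

assert torch.cuda.device_count() >= 2, "needs at least two CUDA GPUs"

# 256M float32 values = 1 GiB payload.
src = torch.empty(256 * 1024 * 1024, device="cuda:0").normal_()
torch.cuda.synchronize("cuda:0")

start = time.time()
dst = src.to("cuda:1")  # device-to-device copy; peer-to-peer when the topology allows it
torch.cuda.synchronize("cuda:0")
torch.cuda.synchronize("cuda:1")
elapsed = time.time() - start

gib = src.numel() * src.element_size() / 2**30
print(f"copied {gib:.1f} GiB in {elapsed:.3f} s -> {gib / elapsed:.1f} GiB/s")
```

A copy like this rides NVLink or the PCI-E root complex depending on the system layout, which is exactly the variable Supermicro's single-root design targets.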
Wednesday, September 27, 2017
Rubrik Announces Alta 4.1 Release with Rich Enhancements to Multi-Cloud Data Management
#Rubrik delivers a complete enterprise data management platform for all #Microsoft apps; 4.1 introduces support for @Microsoft #AzureStack and #CloudOn for Microsoft Azure to recover and instantiate apps in the cloud
- New @GoogleCloudPlatform integration completes support for all major cloud providers
- Latest release broadens support to all government infrastructure offerings from #AmazonWebServices ( @AWS ), including advanced secure services like Commercial Cloud Services (C2S)

PALO ALTO, Calif., Sept. 27, 2017 /PRNewswire/ -- Rubrik today announced the 4.1 release of its Cloud Data Management platform. Rubrik Alta 4.1 completes support for all major cloud providers with a new Google Cloud Platform integration. The release also broadens support for Amazon Web Services (AWS) and Microsoft Azure, making it easier than ever for companies, governments and nonprofits to keep and use their data anytime, anywhere. According to IDC, virtually all new application deployments will be made in a cloud-oriented architecture within the next two years, and 47 percent of all organizations plan to purchase a copy data management solution in that timeframe.

"Cloud adoption is top of mind for CIOs on the hunt for savings, scalability and increased business efficiency," says Arvind Nithrakashyap, CTO of Rubrik. "With our latest release, every enterprise can mobilize their apps and data to any public cloud environment. Version 4.1 of Rubrik Cloud Data Management continues the rapid expansion of our cloud solution portfolio. Now, every enterprise can mobilize their apps and data to any cloud environment and enjoy instant data access, regardless of underlying infrastructure."

A Complete Enterprise Data Management Platform for All Microsoft Enterprise Apps
Rubrik can be deployed to orchestrate all critical data management functions – backup and recovery, replication and DR, archival, search, and analytics – for all Microsoft apps: Windows, Hyper-V, SQL, Azure, and Azure Stack. Rubrik Cloud Data Management was architected from the beginning to deliver cloud archival and data services. With Alta 4.1, Rubrik CloudOn instantiation services support both AWS and Microsoft Azure, providing "server-less" conversion of VMs to cloud instances. Now enterprises can use Rubrik CloudOn services to power on applications in Azure, mobilizing them from data center to cloud. This lowers the costs of disaster recovery and test/development in the public cloud by utilizing cloud services only when needed. Rubrik is one of the first data management vendors to announce support for Microsoft's new private cloud offering, Azure Stack. With Rubrik, users can easily migrate applications from Azure Stack to Azure and accelerate test/development of on-premises and cloud applications.

"With Rubrik, we could eliminate tape, integrate seamlessly with Microsoft Azure, and securely mobilize our applications from on-premises to the cloud," says Leonard De Botton, CIO at Berkeley College. "Together, Rubrik and Azure help us decrease costs, stay within our data retention policy, and establish a stronger DR strategy."

Integration with All Public Cloud Environments
Rubrik has always been available in the private, hybrid or public cloud. Rubrik Alta 4.1 adds cloud archival support for all classes of Google Cloud Storage (Nearline, Coldline, Multi-Regional, and Regional).
Enterprises can now orchestrate application data across a multi-cloud environment for long-term retention while retaining instant data accessibility through search.

Support for All AWS Government Cloud Offerings and New Compliance Features
Rubrik's platform enables federal government agencies and commercial contractors to securely manage data from creation to expiration across private and public clouds with end-to-end encryption. Rubrik supports Microsoft Azure's GovCloud and AWS GovCloud, and is an Advanced Tier Technology Partner in the AWS Partner Network (APN). Rubrik Alta 4.1 broadens support to all government infrastructure offerings from AWS, including advanced secure services like Commercial Cloud Services (C2S). Rubrik 4.1 also adds support for Amazon Glacier, completing support for all AWS storage classes – Amazon Simple Storage Service (Amazon S3), Standard-Infrequent Access (Standard IA), and Amazon Glacier. Customers can now automate data archival to Amazon Glacier to fulfill legal and compliance requirements. With 4.1, it takes only one click to enable the Amazon Glacier Vault Lock policy for a Write-Once-Read-Many (WORM) locked archive on AWS for strict regulatory and compliance controls.

Offering Multi-tenancy to Accelerate Secure, Self-service Delivery of Data Management Services
Rubrik bolsters management security for multi-tenant organizations – such as Service Delivery Partners and large enterprises delivering IT-as-a-service – by adding secure, exclusive access to data management services in a hybrid cloud environment. With Rubrik 4.1, administrators can assign granular permissions to managed objects (e.g., assign Oracle DBs to Oracle Admins) to accelerate the self-service delivery of protection policies, recoveries, and analytics reporting across a multi-cloud environment.

Rubrik Alta 4.1 is generally available through the company's global partner network.
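Rubrik's "one click" wraps AWS's own two-step Vault Lock workflow: initiate the lock, which attaches the policy for a 24-hour test window, then complete it to make the policy immutable. For context, a boto3 sketch of that underlying workflow; the vault name and 365-day retention policy are illustrative, not Rubrik's implementation.

```python
import json
import boto3

glacier = boto3.client("glacier")
vault = "example-compliance-vault"  # illustrative vault name

# A WORM-style policy: deny archive deletion until archives are a year old.
policy = {
    "Version": "2012-10-17",
    "Statement": [{
        "Sid": "deny-early-delete",
        "Principal": "*",
        "Effect": "Deny",
        "Action": "glacier:DeleteArchive",
        "Resource": f"arn:aws:glacier:us-east-1:123456789012:vaults/{vault}",
        "Condition": {"NumericLessThan": {"glacier:ArchiveAgeInDays": "365"}},
    }],
}

# Step 1: attach the policy in its 24-hour in-progress test window.
lock_id = glacier.initiate_vault_lock(
    accountId="-", vaultName=vault, policy={"Policy": json.dumps(policy)}
)["lockId"]

# Step 2: lock it in; after this the policy can no longer be changed or removed.
glacier.complete_vault_lock(accountId="-", vaultName=vault, lockId=lock_id)
```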
Dell EMC Expands Commitment to Microsoft Customers with Converged Infrastructure Advancements
ORLANDO, Fla., Sept. 27, 2017 /PRNewswire/ -- @MICROSOFTIGNITE --

News summary
- @DellEMC XC Series offers new integration with Dell EMC data protection and #MicrosoftWindows2016 for simplified backup and hyper-converged infrastructure management
- Dell EMC approved to deliver #Microsoft #Azure through the Cloud Solution Provider (CSP) Direct and Indirect Microsoft partner model, offering end-to-end Microsoft Azure Stack hybrid cloud solutions directly to customers and through channel partners globally
- Dell EMC Ready Bundle for Microsoft SQL Server and Dell EMC Microsoft Storage Spaces Direct Ready Nodes add performance improvements from Dell EMC PowerEdge 14th generation servers

Full story
Dell EMC announces broadened support of Microsoft data center environments with substantial updates across the Dell EMC XC Series, Dell EMC Cloud for Microsoft Azure Stack and Dell EMC Ready Bundle for Microsoft SQL Server. Addressing key IT transformation opportunities and use cases from hyper-converged infrastructure (HCI) to hybrid cloud and data analysis, these updated solutions demonstrate Dell EMC's continued commitment to innovate and spur success for customers operating in Microsoft environments.

"Dell EMC values the strong collaboration we have had with Microsoft for more than 30 years, which has resulted in world-class, innovative solutions delivered to customers worldwide," said Armughan Ahmad, senior vice president and general manager, Hybrid Cloud and Ready Solutions, Dell EMC. "The innovations we're announcing today are evidence of how our work with Microsoft has truly changed how our customers conduct their day-to-day activities, enabling them to gain greater value from their IT infrastructures and, more importantly, develop and deliver services to help achieve their ultimate business goals."

Dell EMC XC Series Advances Integrations with Dell EMC Data Protection and Microsoft Windows 2016
The Dell EMC XC Series offers world-class HCI appliances for customers running a variety of virtualized workloads with their choice of hypervisors, including Microsoft Hyper-V. As XC Series HCI environments become increasingly popular for Microsoft workloads, including SQL, Microsoft Exchange and Microsoft SharePoint, new advancements to the XC Series include:
- Dell EMC XC Series Data Protection Management Console, which is launched from within the XC Series management interface and integrates and simplifies data protection operation and automation across XC Series deployments. This offers complete data protection capabilities with Dell EMC Avamar VE running on XC Series clusters, backup to Dell EMC Data Domain, and cloud tiering to Virtustream and Microsoft Azure. IT admins gain the ability to quickly and easily monitor and manage backups for multiple XC Series clusters while utilizing best-of-breed Dell EMC technologies in one turnkey solution.
- Optimization for Microsoft Windows 2016 Hyper-V plus Azure, available at no additional cost to customers, including support for Windows Hyper-V 2016 with insights into XC Series clusters. The XC Series Azure Log Analytics Solution integrates XC Series into customers' OMS-based data center automation tools, enabling insights such as trend analysis and behavioral anomaly detection.
Dell EMC Approved to Deliver Microsoft Azure through the Cloud Solution Provider (CSP) Direct and Indirect Model Worldwide
Dell EMC also announces it has been approved by Microsoft to deliver Azure through the Cloud Solution Provider (CSP) Direct and Indirect Microsoft partner model worldwide. Based on Dell EMC's proven Azure and hybrid cloud platform global expertise, this gives the company the advantage of offering comprehensive, end-to-end Azure Stack hybrid cloud solutions, including Azure services and solutions, to both customers and channel partners globally. Dell EMC Cloud for Microsoft Azure Stack, introduced in May 2017, is a new, turnkey hybrid cloud platform delivering infrastructure and platform as a service, and is shipping now to early customers. It offers a fast and simple path for implementing and sustaining a trusted hybrid cloud platform based on Microsoft Azure Stack, delivering a consistent experience across the Azure public cloud and on-premises through Azure Stack.

Dell EMC Ready Solutions Add Improved Compute Performance and Storage Options
Providing customers with more powerful compute performance capabilities, Dell EMC has added support for Dell EMC PowerEdge 14th generation servers to the Dell EMC Ready Bundle for Microsoft SQL Server and the Dell EMC Microsoft Storage Spaces Direct Ready Nodes. Additionally, the Dell EMC Ready Bundle for Microsoft SQL Server adds Dell EMC Unity storage, offering customers a high-performance, affordable, flash-optimized architecture that supports scores of VMs while delivering abundant input/output operations per second (IOPS) at low latency for running demanding Microsoft SQL Server workloads.

Availability
Dell EMC XC Series updates have planned global availability in Q4 2017. Dell EMC Ready Bundle for Microsoft SQL and Dell EMC Microsoft Storage Spaces Direct Ready Nodes updates have planned global availability in Q4 2017.

Partner quote: Gavriella Schuster, corporate vice president, Worldwide Partner Group at Microsoft Corp.
"Customers are adopting cloud-based solutions to drive digital transformation like never before, and we're thrilled to expand the capabilities for cloud partners under the Microsoft Cloud Solution Provider program. By joining the Microsoft Cloud Solution Provider program, our partners, such as Dell EMC, will deepen customer relationships and expand business opportunities in the cloud."

Customer quotes:
Morgan Jones, senior server administrator, Arizona State University
"With our several XC Series clusters across three physical data centers, we've been able to save considerable administrative time and resources, while easily scaling, by simply dropping in new nodes as needs arise. Our appliances are hosting core Microsoft workloads, such as SQL databases, so the new support for Windows Hyper-V 2016 with one-click and rolling cluster upgrades is huge for us. This adds even more simplicity for a solution that already delivers on this promise."

Yukihisa Kato, director, executive officer and general manager, Engineering Administrative Division, Mitsui Knowledge Industry Co., Ltd. (MKI)
"Azure Stack's hybrid cloud model is a key transformation enabler for our own business and for customers, and Dell EMC continues to be a proven leader in providing highly reliable hybrid cloud platforms. The combination of Azure Stack with Dell EMC brings together innovative technology and market experience that we are leveraging to help drive our customers' business transformations."
Ketil Neteland, advisor at Candidator
"Dell EMC is the trusted partner of Candidator for Microsoft Azure Stack in Norway and Sweden. Together we bring innovation to our customers, maximizing the potential of Azure Stack. Dell EMC's long history and extensive portfolio deliver the foundation for our business, and we see Azure Stack as strategic in the future to help our joint customers undertake their IT and digital transformations. Candidator is known for delivering innovative technology to its customers and, with Azure Stack, will continue our promise to customers to be their partner and help them focus on their core business."
Cohesity Expands Integration with Microsoft Azure to Enable Data Protection Directly to the Azure Cloud Platform
SANTA CLARA, CA -- (Marketwired - Sep 27, 2017) - Cohesity, a pioneer of hyperconverged secondary storage, today announced new capabilities that expand the power of its integration with #Microsoft #Azure by enabling its customers to protect data directly to the Azure cloud platform. The new functionality enables a #Cohesity Cloud Edition cluster to be deployed in Azure or Azure Government to back up on-premises applications and send all backup data directly to Azure. Customers benefit from Cohesity data management in Azure to reduce the time to provision data for other use cases such as test/dev and analytics. The solution builds on the existing Cohesity and Microsoft joint solutions that empower customers to seamlessly manage all their secondary storage workloads across on-premises data centers and the Azure cloud platform.

Cohesity DataPlatform with Azure delivers a hybrid cloud data fabric that makes data ubiquitously available between on-premises servers and the cloud to support backups, disaster recovery, application mobility, and test/dev through a single, radically efficient platform. The integrated platform seamlessly extends secondary data storage into Azure to benefit from the global scale and enterprise-grade security of Microsoft's public cloud platform.

Enterprises have traditionally struggled to manage the rapidly increasing volume of secondary data stored across a multitude of legacy storage solutions. Built to handle one specific use case each, these point solutions result in complex storage silos that are difficult to manage, costly, and unable to scale to keep pace with accelerating data growth. With Cohesity, enterprises have the power to consolidate all secondary storage services on a unified, web-scale platform that extends across private, public, and government clouds, is easily administered through a single user interface, and can expand or contract to fit the company's needs at any given moment.

"We are seeing growing customer demand for integration with Microsoft Azure to simplify hybrid cloud data management," said Patrick Rogers, head of marketing and product at Cohesity. "Our new solution will enable enterprises to accelerate their adoption of the Azure cloud and Cohesity for data protection use cases, with the ability to add other secondary storage workloads over time."

Tad Brockway, general manager of Azure Storage at Microsoft, said, "This new capability makes it easier and more efficient for customers to back up their on-premises application data to Microsoft Azure or Azure Government directly. Working with Cohesity, we are pleased to help simplify the onramp to hybrid cloud environments with the Cohesity and Microsoft software-defined storage solution."

Ben Price, administrative and residential IT director for the University of California, Santa Barbara, said, "Replication performance to Microsoft Azure Government and CJIS compliance for use with police car and body cam video capture and storage was a key requirement for us. Cohesity's native cloud integration allowed us to seamlessly replicate and archive production data offsite to Azure and Azure Government for securely handling police videos."

To learn more about the expanded solution from Cohesity and Microsoft, check out the new blog post on Cohesity's website.
The new solution will be showcased in the Azure Storage PG booth at the Microsoft Ignite conference scheduled to take place Sept. 25-29 in Orlando, Florida.
Arista Leads the Cloud, Mind the ‘In-Sourcing,’ Says Berenberg
#Arista is better set up than competitors for an era of #cloudcomputing networking, argues Josep Bori of Berenberg, given its focus from the start on software. However, he does advise investors to keep an eye on the move to "in-sourced" software by Arista's own customers, such as #Microsoft, as a potential long-term risk.

By Tiernan Ray
Updated Sept. 26, 2017 2:32 p.m. ET

[Image: Facebook's Altoona data center, courtesy of Berenberg.]

Berenberg technology analyst Josep Bori today initiates coverage on two cloud computing leaders, Arista Networks (ANET) and Equinix (EQIX), assigning both stocks a Buy rating and arguing that their respective leadership in their areas of specialty is becoming increasingly evident.

For Arista, to which he assigns a $225 price target, Bori reserves the honor of being "an emerging leader in cloud/datacentre networking," and notes that what sets the company apart is its software, the "EOS" operating system. He notes the company's founders figured out the edge of software back in 2004:

The company founders recognised back in 2004 that the next generation of cloud and enterprise datacentre networks would have significantly higher requirements in terms of performance, resilience, programmability, automation and pricing [...] They also committed to building a hardware-agnostic platform, avoiding the use of application-specific integrated circuits (ASICs), which add materially to a traditional switch's bill of materials. Its Hardware Abstraction Layer (HAL) allows EOS to run on a wide family of merchant silicon (eg Broadcom, Cavium and Fulcrum/Intel) and even on hypervisors and containers (ie virtualised hardware infrastructure).

He likes the suitability of the product for the cloud. Bori features the diagram of a Facebook (FB) data center, at the top of this post, as an example of how Arista is better suited to cloud computing networking models. That gives Arista a nice growth market:

We believe Arista Networks' above-market growth is set to continue due to its large exposure to the high-growth segments of the IT market, where it is gaining market share, largely due to the reasons described in the prior section. Indeed, the company's revenues grew 43% and 35% in 2015 and 2016 respectively, while the overall datacentre Ethernet switch market grew 5% and 13%, respectively.

The software lets the company be better prepared for a scenario in which networking gradually moves from integrated boxes to being just a software sale, he thinks:

We believe the company is better prepared than most of its peers for a potential business model transition from hardware appliances to software subscriptions, an ongoing industry trend, largely due to its 1) single-image operating system across all its products, 2) which is hardware-independent by design, and 3) an already "unbundled" offering via its vEOS and cEOS for virtual machines and containers respectively.

That said, Bori notes that there is a risk to #Arista from both software-only networking vendors, including "several pure software vendors in the market, such as #CumulusNetworks, #BigSwitch and #Pica8," and also from its large cloud customers "in-sourcing" their own networking software. Bori goes through a lengthy discussion in particular of #Microsoft (MSFT), Arista's biggest customer. Microsoft is developing something called " #SONiC ," its own networking software.
Bori concludes that the threat of in-sourcing can't be totally dismissed:

In conclusion, while 1) it is encouraging that Microsoft's use of Arista's technology has continued to increase even after publicly announcing SONiC last year, 2) Arista Networks features prominently in its architectural slides, and 3) the level of internal IT and networking know-how required to develop a disaggregated switch on SONiC is likely only available to the very largest webscale vendors, it is also true that Microsoft knows a thing or two about developing operating systems and is currently working on its own Azure networking layer, which could potentially lead to a change in Arista's relationship with its largest customer over the long term.

Arista stock today is up $2.64, or 1.5%, at $183.47.
Datrium Announces Oracle Partnership And Oracle RAC Qualification
SUNNYVALE, Calif., Sept. 26, 2017 /PRNewswire/ -- #Datrium, the leading provider of #OpenConverged Infrastructure for hybrid clouds, today announced it has been named an #Oracle Gold Partner. The company also announced qualification of Oracle Real Application Clusters ( #RAC ) on Datrium #DVX, extending the administrative simplicity, robust availability and integrated backup benefits of its Open Convergence approach to customers' Oracle RAC deployments.

As IT organizations seek alternatives to expensive and inflexible array infrastructure for mission-critical Oracle RAC deployments, they encounter three main challenges with hyperconverged infrastructure (HCI). When organizations need to roll out a small two-node RAC deployment, HCI cannot offer a two-node cluster: to provide the highest level of availability, the minimum cluster size for HCI is five nodes, which is subject to more than twice the Oracle licensing costs. There is also an availability mismatch between HCI and Oracle RAC. For example, RAC's built-in availability architecture ensures that a four-node RAC deployment will continue to run even if three of the four nodes fail, whereas with HCI, should more than two nodes fail, the entire cluster fails. Finally, many HCI systems do not have the built-in copy data management to clone production databases for test/dev purposes with zero performance impact.

Simplifying Oracle RAC Deployments
That's all changing with Datrium's announced support for virtualizing Oracle RAC on its industry-leading Open Convergence platform, Datrium DVX. This new breed of converged infrastructure combines compute, flash-based primary storage, and integrated backup, now with support for Oracle RAC. With Datrium, customers can deploy mixed production application workloads and test/development workloads in one platform while maintaining isolation for optimal service levels. This unique approach to convergence addresses issues with Oracle RAC on HCI and gives customers a more compatible server-powered option for their deployments.

"At Neovera, we have built our reputation on addressing the unique needs of our customers with unmatched customer service. We were looking to move off our legacy array infrastructure to a modern convergence solution for our Oracle RAC infrastructure that combined VM-level administration with robust availability and bare-metal performance," said Scott Weinberg, CEO and founder. "With Datrium, we got that and more, including end-to-end data security and an integrated backup platform which saves us a ton of time and money."

With Datrium's support of Oracle RAC, scaling is configuration-free. Data services such as erasure coding, global deduplication and compression are always on, and there is no need to selectively configure performance versus efficiency for a given workload: with Datrium you get both. Adding another Oracle RAC node is as simple as cloning a virtual machine in VMware vCenter. And performance remains high with automatic alignment of I/O with the most active Oracle RAC node and VMware host, eliminating network reads and maintaining high performance.

Robust Availability, Data Integrity and Security
With Datrium DVX, servers remain stateless, so where there are N servers, N-1 servers can fail and the DVX remains available. This is well matched with Oracle RAC, which is architected with the same availability model.
In addition to high availability, Datrium DVX provides end-to-end data integrity checking for Oracle RAC, meaning the data written to the RAC node will always match the data written to both flash and secondary storage. Finally, DVX provides end-to-end data security by encrypting Oracle RAC data on the host, in flight across the network, and at rest on the DVX Data Node. Datrium DVX also provides Oracle RAC customers with powerful real-time VM analytics, providing instant insights at the database VM level, improving visibility, saving time and speeding results.

Integrated Data Management for Oracle RAC
Datrium DVX provides near-instant recovery for Oracle RAC by providing consistent local and remote snapshot backups of RAC VMs across all VMware hosts, complementing Oracle RMAN and Data Guard. Backups are cost-effectively maintained on Datrium's Data Nodes, which store both hot and cold data in compressed and globally deduplicated form on secondary storage. Backups can also be archived to Amazon Web Services and managed from a single pane of glass within the DVX console. In addition to integrated backup, Datrium also accelerates development cycles for Oracle RAC developers with advanced copy data management functionality. Cloned copies of production databases are immediately available to software developers or QA engineers, speeding design, development and testing.

"While HCI clusters are replacing some types of arrays, they are not a good match for the gold standard in enterprise data availability, Oracle RAC," said Brian Biles, Founder and CEO of Datrium. "Datrium's Open Convergence approach was designed for scalable host isolation and efficient cloud data management, so RAC is a great fit, and we're thrilled to be named an Oracle Gold Partner."

Oracle Gold Partner Status
Datrium has been awarded Oracle Gold Partner status, gaining access to extensive partner training, development and demonstration licenses, and other technical resources. With its Gold-level status, Datrium and Oracle can now collaborate effectively to provide joint customers with the best possible solution experience.
http://markets.businessinsider.com/news/stocks/Datrium-Announces-Oracle-Partnership-And-Oracle-RAC-Qualification-1002571979
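The availability mismatch described above is easy to see in a toy model: RAC stays up while any node survives, the HCI cluster described in the release fails once more than two nodes are lost, and DVX's stateless hosts tolerate N-1 failures. A simplified sketch with the thresholds taken from the article (real failure behavior depends on configuration):

```python
def rac_available(nodes: int, failed: int) -> bool:
    # Oracle RAC keeps serving as long as at least one node survives.
    return nodes - failed >= 1

def hci_available(failed: int) -> bool:
    # Per the article: the HCI cluster fails once more than two nodes fail.
    return failed <= 2

def dvx_available(servers: int, failed: int) -> bool:
    # Stateless DVX hosts: N-1 of N servers can fail.
    return servers - failed >= 1

print("failed  RAC(4)  HCI(5)  DVX(4)")
for failed in range(5):
    print(f"{failed:6d}  {rac_available(4, failed)!s:6s}  "
          f"{hci_available(failed)!s:6s}  {dvx_available(4, failed)!s:6s}")
```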
IBM Unveils a New High-Powered Analytics System for Fast Access to Data Science
ARMONK, N.Y., Sept. 26, 2017 /PRNewswire/ -- IBM (NYSE: #IBM ) today announced the Integrated Analytics System, a new unified data system designed to give users fast, easy access to advanced data science capabilities and the ability to work with their data across private, public or hybrid cloud environments.

The system, which comes with a variety of data science tools built in, allows data scientists to get up and running quickly to develop and deploy advanced analytics models in place, directly where the data resides, for greater performance. And because it is based on the IBM common SQL engine, clients can use the system to easily move workloads to the public cloud to begin automating their businesses with machine learning. In fact, because the popular database engine is used across both hosted and cloud-based databases, users can move and query data across multiple data stores, such as Db2 Warehouse on Cloud or the #Hortonworks Data Platform.

At the heart of the Integrated Analytics System are the #IBM Data Science Experience, #Apache #Spark and the #Db2 Warehouse, all of which have been optimized to work together with straightforward management. The Data Science Experience provides a set of critical data science tools and a collaborative workspace through which data scientists can create new analytic models that developers can use to build intelligent applications quickly and easily. The inclusion of Apache Spark, the popular open source framework, enables in-memory data processing, which speeds analytic applications by allowing analytics to be processed directly where the data resides.

New to this class of offering are the machine learning capabilities that come with both the Data Science Experience and Spark embedded on the system. Having machine learning processing embedded means that data does not need to be moved to the analytics processing, reducing the associated processes and wait times for analytics to run and respond. This simplifies the process of training and evaluating predictive models, as well as testing, deployment and training, as it is all done in place.

"The combination of high performance and advanced analytics – from the Data Science Experience to the open Spark platform – gives our business analysts the ability to conduct intense data investigations with ease and speed," said Vitaly Tsivin, Executive Vice President at AMC Networks, who has been testing the system for several months. "The Integrated Analytics System is positioned as an integral component of an enterprise data architecture solution, connecting IBM Netezza Data Warehouse and IBM PureData System for Analytics, cloud-based Db2 Warehouse on Cloud clusters, and other data sources."

"Today's announcement is a continuation of our aggressive strategy to make data science and machine learning more accessible than ever before and to help organizations like AMC begin harvesting their massive data volumes – across infrastructures – for insight and intelligence," said Rob Thomas, General Manager, IBM Analytics.

Seamless Expansion to the Cloud
The integrated architecture of the new system combines software enhancements such as asymmetric massively parallel processing (AMPP) with IBM Power® technology and flash memory storage hardware, and builds on the IBM PureData System for Analytics and the previous IBM Netezza data warehouse offerings.
It also supports a wide range of data types and data services, including everything from the Watson Data Platform and IBM Db2 Warehouse on Cloud to Hadoop and IBM BigSQL. Like these solutions, the Integrated Analytics System is built with the IBM common SQL engine, enabling users to seamlessly integrate the unit with cloud-based warehouse solutions. In addition, industry-standard tools and the common SQL engine give users the option to move these workloads seamlessly to public or private cloud environments with Spark clusters, based on the user's requirements. Like IBM's existing data warehouse products, the Integrated Analytics System is designed to provide built-in data virtualization and compatibility with Netezza®, Db2®, and IBM PureData System for Analytics. Beyond these capabilities, the new system also incorporates hybrid transactional analytical processing (HTAP). In contrast to typical business environments where transaction processing and analytics are run on distinct architectures, HTAP runs predictive analytics, transactional and historical data on the same database at accelerated response times. Later this year, the company plans to add support for HTAP with IBM Db2 Analytics Accelerator for z/OS, which will enable the system to transparently integrate with IBM z Systems infrastructures.
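The embedded Spark engine is what lets analytics run "directly where the data resides." For a sense of the programming model, a minimal PySpark sketch of an in-place SQL aggregation; the toy rows stand in for what would be a warehouse table on the appliance (illustrative only, not IBM's code):

```python
from pyspark.sql import SparkSession

spark = SparkSession.builder.appName("in-place-analytics").getOrCreate()

# Toy data standing in for a warehouse table on the appliance.
df = spark.createDataFrame(
    [("east", 120.0), ("west", 340.5), ("east", 80.25)],
    ["region", "sales"],
)
df.createOrReplaceTempView("sales")

# The aggregation runs where the data sits, with no export step.
spark.sql("SELECT region, SUM(sales) AS total FROM sales GROUP BY region").show()
```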
Alibaba beats Google for IaaS market share, with IBM out of sight
#Google is the world's number four #infrastructureAsAService vendor, according to analyst outfit Gartner's first-ever attempt at calculating market share in the field.

The infrastructure-as-a-service (IaaS) market is growing like a weed: it hauled US$22.1 billion through the door in calendar 2016, up from $16.8 billion in 2015. As the table below shows, that's 31.4 per cent growth across the sector. Gartner predicts that the rise of #Azure, #Google and #Alibaba will see #AWS experience "growth erosion in share … while other IaaS market leaders will see an increase in growth." Smaller, non-hyperscale providers will "struggle to provide value through their services," the firm predicts. Here's the tale of the tape.

Vendor       2016 Revenue ($m)   2016 Share (%)   2015 Revenue ($m)   2015 Share (%)   2015-2016 Growth (%)
Amazon               9,775             44.2               6,698             39.8               45.9
Microsoft            1,579              7.1                 980              5.8               61.1
Alibaba                675              3.0                 298              1.8              126.5
Google                 500              2.3                 250              1.5              100.0
Rackspace              484              2.2                 461              2.7                5.0
Others               9,147             41.2               8,074             48.4               13.2
Total               22,160            100.0              16,861            100.0               31.4

Alibaba's strong showing reflects its dominance in China, Gartner says, adding that the company's recent opening of new data centres in Europe, Australia, the Middle East and Japan should help it to do better beyond the Middle Kingdom. IBM's absence from the top five isn't necessarily terrifying, as the company makes much of its SaaS and PaaS offerings. But missing the top five also means missing growth, because Gartner says it expects IaaS to outpace SaaS and PaaS for the next five years. The firm also says that much of IaaS's expected growth will be "coming at the expense of the traditional, noncloud offerings." Which means the servers, storage and switches on which #DellEMC, #HPE, #Lenovo and #Cisco rely for much of their revenue. ®
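The growth column is plain year-over-year arithmetic on the revenue figures; a quick sketch reproducing it (any small differences from Gartner's published column are rounding):

```python
revenue_m = {  # (2015, 2016) revenue in $m, from the Gartner table above
    "Amazon":    (6_698, 9_775),
    "Microsoft": (980, 1_579),
    "Alibaba":   (298, 675),
    "Google":    (250, 500),
    "Rackspace": (461, 484),
    "Others":    (8_074, 9_147),
    "Total":     (16_861, 22_160),
}
for vendor, (y2015, y2016) in revenue_m.items():
    growth = 100 * (y2016 / y2015 - 1)  # e.g. Amazon: 9,775/6,698 - 1 = 45.9%
    print(f"{vendor:<10} {growth:6.1f}%")
```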
Yahoo is giving a critical piece of internal technology to the world – just like it did with Hadoop
- Yahoo is open-sourcing an internal tool called #Vespa, which it uses for content recommendations, ad serving, and executing certain searches.
- Vespa is arguably #Yahoo 's biggest open-source software release since #Hadoop in 2006, which formed the basis for two now-public companies, #Hortonworks and #Cloudera.
- Companies like #Amazon, #Facebook, and #Google could find it useful.
Oath, the #Verizon-owned parent company of Yahoo, is releasing for free some of its most important internal software, which the company has long used to make recommendations, target ads and execute searches.
The Vespa software solves a common but surprisingly difficult problem: quickly figuring out what to show a user in response to input, like when they type text into a box. Oath uses it in around 150 applications, including Flickr, Yahoo Mail and the main Yahoo search engine (specifically for components like entities, local results, images and answers to questions). It handles 3 billion native ad requests every day.
"The typical case is you don't know what you want to serve, but you have 20 billion pictures and you want to find the right ones," Jon Bratseth, a distinguished architect at Yahoo who led Vespa's development, told CNBC in an interview.
Vespa, which is now live on GitHub with an Apache 2.0 open-source license, can easily be added to different applications, making it suitable for use at big companies like Amazon, Facebook and Google that need to do different kinds of processing on different sets of data.
The release is the most important for Yahoo since it open-sourced the code for the Hadoop big data software in 2006. Hadoop has since come to be at the center of two public companies, Cloudera and Yahoo spin-off Hortonworks. Today people at lots of companies can contribute to technology that's still widely used at Yahoo, and build their own systems using Hadoop.
How Yahoo built it
Big tech companies regularly open-source their software. But if there's powerful software at the heart of a company's biggest revenue centers, it can take a while to come out into the open, and Vespa is no different.
Vespa dates back to the early 2000s. Yahoo already had web search technology, first through a partnership with Google and later through its 2002 Inktomi acquisition. What Yahoo didn't have was technology for delivering search results and recommendations on content that falls outside traditional web search results.
In 2003 Yahoo acquired Overture, which included its partner AltaVista as well as a lesser-known search engine called AllTheWeb.com. After the deal, the roughly 30 AllTheWeb people were given a year to build software that could perform certain functions quickly before web pages were shown to end users. The system also needed to be easy to set up, run and tweak, so that it could be applied to a variety of applications without much trouble.
In around 2005, the AllTheWeb team worked with Yahoo's shopping team to adopt the new system. It required less management time, freeing up staffers to build new features.
"After that, we had a proven use case -- and that was a complicated one," Bratseth said. "More and more teams in Yahoo started using our system by themselves, because it made business sense. They would offload a lot of the problems they had to take care of themselves."
So Bratseth's team started expanding the powers of Vespa. They made it capable of handling input other than users' strings of text; over time it could also personalize content based on what users had clicked on in the past, which is valuable in cases when users haven't typed in anything. They also changed Vespa so that it could take direction from machine-learning algorithms.
https://www.cnbc.com/2017/09/26/yahoo-open-sources-vespa-for-content-recommendations.html
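Vespa is queried over HTTP, so trying it after the GitHub release is straightforward. A hedged sketch of a search against a locally running instance, assuming a deployed application with a document type exposing a 'title' field (the endpoint, field and query are illustrative; see the Vespa repository's documentation for the exact query API):

```python
import requests

# Default search endpoint of a local Vespa container node (assumed port 8080).
endpoint = "http://localhost:8080/search/"

params = {
    "yql": 'select * from sources * where title contains "camera";',
    "hits": 5,
}
response = requests.get(endpoint, params=params)
response.raise_for_status()

# Results come back as a tree; hits live under root.children.
for hit in response.json().get("root", {}).get("children", []):
    print(hit.get("relevance"), hit.get("fields", {}).get("title"))
```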
Cisco Cutting About 310 Headquarters Jobs Under Strategy Shift
#CiscoSystems Inc., the biggest maker of computer networking equipment, said it will cut about 310 jobs at its headquarters in San Jose, California. Cisco is trying to shift from a reliance on revenue from high-cost proprietary equipment, which some customers are turning away from, to software and services. The company has more than 73,000 employees, according to data compiled by Bloomberg. "Cisco regularly evaluates its business and will always make the changes necessary to effectively manage our portfolio and drive the most value for our customers and shareholders," the company said in a statement. "As a result, this can mean realigning some areas so that we can invest in others such as security, data center/cloud and networking."
Memory-Chip Maker Micron Crushes Fourth-Quarter Earnings Views
Memory-chip maker #MicronTechnology (MU) late Tuesday reported much better-than-expected sales and earnings in its fiscal fourth quarter ended Aug. 31, and guided analysts higher for the current quarter. The Boise, Idaho-based company earned $2.02 a share excluding items, reversing a year-earlier loss of 1 cent a share. Sales were $6.14 billion, up 91% year over year. Wall Street analysts expected earnings of $1.84 a share and revenue of $5.96 billion in the period.

Micron stock was up more than 3% in after-hours trading on the stock market today. During the regular session, shares fell 2% to $34.18. For Micron's current fiscal first quarter, the company expects to earn $2.16 a share excluding items on sales of $6.3 billion, based on the midpoint of its guidance. In the same period last year, Micron earned 32 cents a share on sales of $3.97 billion. Wall Street has been modeling for $1.85 a share in non-GAAP earnings on sales of $6.06 billion.

IBD'S TAKE: Micron stock has an IBD Composite Rating of 90, meaning it has outperformed 90% of stocks in key metrics over the past 12 months. For more analysis of Micron, visit the IBD Stock Checkup.

Micron provides memory chips for PCs, smartphones, consumer electronics, data centers, enterprise storage, automotive and other applications. It gets about two-thirds of its revenue from DRAM chips, with the rest coming mostly from NAND data-storage chips. Micron benefited from "robust demand" for its memory and storage products as well as solid execution in the fourth quarter, Chief Executive Sanjay Mehrotra said in a news release. "We expect healthy industry fundamentals to continue into 2018, supported by increasingly diverse end markets and applications," Mehrotra said. "We believe our focus on accelerating the deployment of advanced technologies and solutions will address our customers' evolving requirements, further strengthen our financial foundation, and enhance shareholder value."
http://m.nasdaq.com/article/memory-chip-maker-micron-crushes-fourth-quarter-earnings-views-cm851380
Will Cisco or Lenovo Scoop Up NetApp to Boost Their Storage Play?
#Cisco may be looking to scoop up #NetApp to boost its storage play, according to Summit Redstone analyst Srini Nandury.
Barron’s last week quoted Nandury as saying that Cisco is “likely to acquire NetApp rather than #PureStorage, given that NetApp will be immediately accretive ($5.8 billion revenue for 2018).”
It might be a smart move for Cisco. Despite all of its recent software startup acquisitions, the networking giant doesn't have a competitive data center storage portfolio. And it already partners with NetApp on FlexPod, its converged infrastructure system; the latest version uses NetApp SolidFire all-flash storage.
Neither Cisco nor NetApp would say whether a deal is in the works. Spokespeople from both companies said they “don’t comment on speculation and rumors.”
But as far as Silicon Valley rumors go, it’s not that far-fetched.
IDC analyst Ritu Jyoti told SDxCentral that both Cisco and Lenovo are in the market for a storage play.
Both of these vendors are also looking to boost their hyperconverged infrastructure (HCI) market share. HCI is the fastest growing segment in the data center space, and software-defined storage plays a key role in this technology.
Cautionary (Whip)Tail
“Cisco has been toying around with this for a while,” Jyoti said.
Case in point: Whiptail, an all-flash array vendor that Cisco acquired for $415 million in cash in 2013. The resulting product line is now dead.
“Whiptail was a complete disaster,” Jyoti said. “It was a combination of the technology not being up to the mark, and internally Cisco struggled with how to make the best go-to-market with that. Storage is a complex beast.”
But if Cisco wants to compete against Dell EMC, Hewlett Packard Enterprise, and IBM in the software-defined storage and data center infrastructure space, it needs to acquire a storage company, she added.
“Absolutely Cisco has to compete in this world and have a sustainable, competitive play in the data center market,” Jyoti said. “It needs a storage play, which is something that it’s badly missing.”
This will require more than just buying power. Cisco must do due diligence on the technology, as well as the sales force, go-to-market strategy, and transition plan.
From this perspective, both NetApp and Pure Storage are “definitely interesting plays,” Jyoti said. “My thinking is both Lenovo and Cisco are in the market to acquire someone. And who does what with which one is going to be determined by market cap, what they can digest, and what they can integrate.”
NetApp vs. Pure Storage
NetApp’s an attractive prospect. Its all-flash storage arrays have been a major boon to business since the company acquired SolidFire, an all-flash storage vendor, for $870 million in 2016.
In August, NetApp reported better-than-expected earnings, with net revenues for the first quarter topping $1.33 billion, up 2 percent year over year. CEO George Kurian said the company’s growth shows that NetApp’s “transformation” from a legacy storage appliance maker to a hybrid cloud company is working.
“NetApp is growing,” Jyoti said. “George Kurian has tried to turn things around. He has a positive, data-driven strategy. Customers speak very highly of NetApp. But having said that, NetApp has some legacy baggage.”
That baggage includes bringing the sales force up to speed on “talking the new language” and selling the product to the new data center buyer, she added. “Cisco or Lenovo, whoever picks them up, it can be a good acquisition but at the right price and with the right trimming strategy.”
Pure Storage isn’t quite as appealing. “It’s not the best product out there but it’s good enough,” Jyoti said. “It doesn’t have the baggage NetApp has, but it also doesn’t have the reputation NetApp has.”
Jyoti says her sources tell her both NetApp and Pure Storage are shopping themselves. “They are both going to be picked up. It’s just a matter of who gets what, with the right strategy at the right time.”
https://www.sdxcentral.com/articles/news/will-cisco-lenovo-scoop-netapp-boost-storage-play/2017/09/
Google Goes Tit for Tat With Amazon On Cloud Pricing
Last week, Amazon made a huge change in how it charges businesses for its cloud services, saying it would start to bill on a per-second basis starting Oct. 2 instead of by the hour. Now rival Google is also going to per-second increments, but is making the change effective immediately. In theory, smaller price increments could cut costs for some customers who don't use a full minute, or full hour, of the computing capacity they have paid for. Amazon Web Services, Google (GOOGL, +2.38%), and Microsoft have leapfrogged each other on pricing and new cloud computing features for several years now. So this is just the latest chapter in that saga as they fight to get businesses to put more of their data and run more of their software on their respective clouds.
#Google and #Microsoft started offering per-minute cloud computing in 2013 while #Amazon (AMZN, +1.30%), the largest and oldest of the cloud providers, held fast to per-hour pricing until now.
Google's new pricing model applies to its basic computing units (which it calls virtual machines, or VMs) as well as its container engine and a few other offerings. The price covers all VMs whether they run #WindowsServer, #RedHat #Linux or #SUSE Linux operating systems; Amazon's per-second pricing, by contrast, applies only to Linux, not to Windows. Google's pricing on "persistent disk" storage attached to these VMs has been billed per second for quite some time.
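To see why the billing increment matters, here is a back-of-the-envelope comparison of per-hour and per-second billing for a single VM. The hourly rate is a made-up figure for illustration, and real providers typically add wrinkles such as a one-minute minimum charge, so treat this as the shape of the savings rather than actual prices.

```python
# Back-of-the-envelope: per-hour vs. per-second billing for one VM.
# HOURLY_RATE is a hypothetical price, not any provider's published rate.
import math

HOURLY_RATE = 0.10  # assumed $/hour for one VM

def cost_per_hour_billing(runtime_seconds):
    # Per-hour billing rounds the runtime up to whole hours.
    return math.ceil(runtime_seconds / 3600) * HOURLY_RATE

def cost_per_second_billing(runtime_seconds):
    # Per-second billing charges only for the seconds actually used.
    return runtime_seconds * (HOURLY_RATE / 3600)

runtime = 61 * 60  # a job that runs 61 minutes
print(cost_per_hour_billing(runtime))    # 0.2  -> billed as two full hours
print(cost_per_second_billing(runtime))  # ~0.1017 -> billed for 61 minutes
```

The gap is largest for short-lived or bursty workloads, which is exactly the kind of usage containers and batch jobs generate.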
At this week's Ignite tech conference in Orlando, Microsoft (MSFT, +0.79%) took another route to price cuts, announcing Amazon-like reserved computing instances that give customers discounts of up to 72% off list price if they lock into using those computing resources for one- or three-year periods.
http://fortune.com/2017/09/26/google-matches-amazons-price-change/
SonicWall Unveils Major Product Updates Across Its Entire Portfolio
CEO Bill Conner, in an exclusive interview with CRN, said the updates signal a "coming out party for the innovation of the new #SonicWall," a year after the company closed its spinout from #Dell and launched as an independent security vendor. "This is the largest set of products and software the company has delivered to market," Conner said. "We are pleased and excited."
The new product updates stretch across the #SonicWall portfolio. One is the launch of #SonicOS 6.5, which Conner said represents an "overhaul" of the operating system around user experience, user interface, and next-generation capabilities. In total, the update adds more than 50 new features across the operating system. It also unifies the code base across SonicWall's TZ, SuperMassive and NSA appliances, something Conner said will make it easier for partners to manage the physical and virtual appliances for customers. That unified code base sets up the company's "API evolution," Conner said, and will help SonicWall service partners and customers. SonicOS 6.5 will be available immediately to partners and customers in North America and Europe, the company said.
SonicWall also announced a new firewall appliance, the Network Security Appliance 2650, aimed at mid-tier, branch, and campus clients. Conner said the NSA 2650 has 2.5-Gigabit Ethernet interfaces and 20 total ports; the specs match the speed and performance requirements of 802.11ac Wave 2 wireless networks. SonicWall also announced a new line of wireless access points compliant with the 802.11ac Wave 2 standard. Conner said SonicWall engineered the access points with higher performance to allow for more security software, including the company's Capture service and encrypted traffic inspection. He said the access points will carry a lower hardware cost than Aerohive and Ruckus, and will be priced in line with Meraki on a volume basis, as the offering includes both hardware and a subscription service. "Think of this as the next-generation endpoint for wireless and mobile networks," Conner said of the two launches. "That's really where we're going with this." SonicWall's executive director of product management, Dmitriy Ayrapetov, said the updates will also help "future-proof" the company's technology for new wireless standards. The new NSA 2650 firewall and SonicWave series access points are available immediately in North America and Europe.
SonicWall also made changes to its Global Management System, including an update to its user interface and user experience, and fully launched its SonicWall Cloud Global Management System, including a new SonicWall Cloud Analytics offering for analytics, forensics, and investigative capabilities on top of the company's firewalls and access points. Finally, SonicWall announced updates around mobile security, including the launch of SonicWall Secure Mobile Access 12.1. The update brings SonicWall's Capture service to mobile devices, as well as federated single sign-on to secure mobile access. SonicWall said the SMA 12.1 and Cloud Analytics offerings will be available in North America and Europe in the fourth quarter.
Intel unveils an AI chip that mimics the human brain
Lots of tech companies including #Apple, #Google, #Microsoft, #NVIDIA and #Intel itself have created chips for #imagerecognition and other #deeplearning chores. Intel, however, is also taking another tack with an experimental chip called "#Loihi." Rather than relying on raw computing horsepower, it uses an old-school, as-yet-unproven type of "#neuromorphic" tech that's modeled after the human brain.
Intel has been exploring neuromorphic tech for a while, and even designed a chip in 2012. Instead of logic gates, it uses "spiking neurons" as the fundamental computing unit. Those can pass along signals of varying strength, much like the neurons in our own brains. They can also fire when needed, rather than being driven by a clock like a regular processor. Intel's Loihi chip has 1,024 artificial neurons per core, or about 130,000 simulated neurons in all, with 130 million possible synaptic connections. That's a bit more complex than, say, a lobster's brain, but a long way from our 80 billion neurons.
Human brains work by relaying information with pulses or spikes, strengthening frequent connections and storing the changes locally at synapse interconnections. Brain cells don't function alone: the activity of one neuron directly affects others, and groups of cells working in concert produce learning and intelligence. By simulating this behavior, the Loihi chip can (in theory) speed up machine learning while reducing power requirements by up to 1,000 times. What's more, all the learning can be done on-chip instead of requiring enormous datasets. If incorporated into a computer, such chips could also learn new things on their own, rather than remaining ignorant of tasks they haven't been taught specifically. These are the sorts of AI behaviors we expect (and fear): robots and other devices that can learn as they go. "The test chip [has] enormous potential to improve automotive and industrial applications as well as personal robots," Intel says.
That all sounds good, but so far neuromorphic chips have yet to prove themselves next to current, brute-force deep-learning technology. IBM has also developed a neuromorphic chip, called "TrueNorth," with 4,096 cores simulating around 256 million synapses. However, Facebook deep-learning specialist Yann LeCun said that chip wouldn't easily handle tasks like image recognition using the NeuFlow convolution model he designed. Intel has likewise admitted that its neuromorphic chip wouldn't do well with some types of deep-learning models. Via its acquisitions of Movidius and Mobileye, though, it already has a line of machine vision and learning chips that do work with current AI algorithms, and it acquired a company called Nervana last year to take on AI cloud-processing leader NVIDIA.
For Loihi, Intel plans to give the chips to select "leading university and research institutions" focused on artificial intelligence in the first half of 2018. The aim is to test the chip's feasibility for new types of AI applications and to boost further development. Intel will build the chips using its 14-nanometer process technology and release the first test model in November.
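To make the spiking-neuron idea concrete, below is a toy leaky integrate-and-fire neuron in Python. It is a software caricature of what chips like Loihi implement in silicon, with arbitrary constants chosen purely for illustration.

```python
# Toy leaky integrate-and-fire (LIF) neuron -- a software caricature of the
# "spiking neuron" computing unit described above. Constants are arbitrary.
import random

class LIFNeuron:
    def __init__(self, threshold=1.0, leak=0.9):
        self.potential = 0.0        # accumulated "charge"
        self.threshold = threshold  # fire when potential crosses this level
        self.leak = leak            # fraction of charge retained each step

    def step(self, input_current):
        """Integrate input, leak some charge, and spike if over threshold."""
        self.potential = self.potential * self.leak + input_current
        if self.potential >= self.threshold:
            self.potential = 0.0    # reset after firing
            return True             # spike emitted
        return False

neuron = LIFNeuron()
for t in range(20):
    spiked = neuron.step(random.uniform(0.0, 0.4))
    print(t, "spike" if spiked else "-")
```

Note that the neuron only accumulates charge as input arrives and only signals when it crosses threshold, which is the event-driven, clockless behavior the article contrasts with a conventional processor.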
https://www.engadget.com/2017/09/26/intel-loihi-neuromorphic-chip-human-brain/
Splunk Trains Workforce of Tomorrow With Amazon Web Services, NPower, Wounded Warrior Project and Year Up
SAN FRANCISCO & WASHINGTON--(BUSINESS WIRE)--.conf2017 – #Splunk Inc. (NASDAQ: SPLK), first in delivering “aha” moments from machine data, today announced it is helping military veterans and youth train for careers in technology through its Splunk4Good initiative and partnerships with the nonprofit organizations NPower, Wounded Warrior Project (WWP) and Year Up, as well as #AWS re:Start. Splunk is also expanding its nonprofit license program to offer free or discounted licenses to more nonprofit organizations. The announcements are the latest expansion of Splunk Pledge, Splunk’s commitment to donate over $100 million in Splunk software licenses, training, support, education and volunteerism to nonprofit organizations and educational institutions over 10 years.
Through partnerships with NPower, WWP’s Warriors to Work initiative and Year Up, U.S.-based youth and military veterans will have free access to Splunk licenses and Splunk’s extensive education resources. There is no limit to the number of military veterans and youth members of NPower and Year Up who can access the training. Military veterans, former service members, and their families will be validated through a partnership with Splunk customer ID.me. Splunk expects to add more workforce training partners this year. U.S. veterans can sign up at https://veterans.splunk.com, and members of NPower and Year Up can sign up at https://workplus.splunk.com.
“Splunk is helping Wounded Warrior Project place veterans on the path to cutting-edge careers by equipping them with tangible and marketable skillsets,” said Lt. Gen. (Ret.) Mike Linnington, CEO, WWP. “We are excited to partner with Splunk through our Warriors to Work initiative because careers requiring analytics and cyber security skills are in high demand, and there are thousands of opportunities available globally. Partners like Splunk help us to better connect, serve and empower wounded warriors every day.”
As part of its commitment to help close the technology skills gap, Splunk is also working with AWS re:Start, a UK-focused military veteran and young adult training program that harnesses the AWS Partner Network (APN) to match veterans and youth with jobs. As part of its global work with AWS, Splunk will provide opportunities for graduates of this program to continue their cloud computing and data analytics education.
Splunk Pledge Makes a Global Impact: Since Splunk announced Splunk Pledge at .conf2016, organizations including Global Emancipation Network, TeamArrow and Team Rubicon have benefited from free Splunk® Enterprise licenses. Their success inspired Splunk to expand its nonprofit licensing program. Splunk is now offering free, 10GB term licenses with elearning and standard support to any nonprofit organization in the world. Nonprofits seeking a larger license qualify for discounted pricing on Splunk Enterprise, Splunk Enterprise Security, Splunk IT Service Intelligence and Splunk Cloud. All nonprofit organizations, including existing qualifying Splunk customers, may apply through the Splunk Pledge website.
“Splunk is a critical enabler for Global Emancipation Network to identify and help prevent trafficking around the world,” said Sherrie Caltagirone, founder and executive director, Global Emancipation Network. “We have a tough mission that necessitates fast information gathering and fast action. Splunk has the ideal machine data expertise from its IT and security customers to help us problem solve quickly and creatively, whether by using Splunk to produce simple cell phone alerts for law enforcement, or by correlating seemingly disparate datasets of phone numbers and advertisement information to discover trafficking rings. Splunk is truly enabling us to solve one of the world’s biggest problems.” “I am deeply proud of Splunk Pledge as we work with the community around us to drive awareness and delivery of education and access to information. Among many successes, Splunk Pledge has already helped nonprofits use analytics to combat human trafficking, optimize solar power in transportation and accelerate humanitarian and disaster response,” said Doug Merritt, President and CEO, Splunk. “Data analytics through Splunk enables businesses to grow and succeed, and now Splunk is enabling individuals, diverse communities, nonprofits and educational institutions around the world to similarly succeed. Splunk, together with our partners, is using machine data to change the world.” Learn more about Splunk Pledge on the Splunk website and how your favorite cause can make a big difference with big data.
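As a rough illustration of the correlation Caltagirone describes, here is a sketch in plain Python (not Splunk's own search language) that joins phone numbers appearing across two otherwise unrelated datasets. All records and field names are fabricated for illustration.

```python
# Illustrative sketch: correlating a phone number across disparate datasets.
# All records and field names here are fabricated.
from collections import defaultdict

ads = [
    {"ad_id": "a1", "phone": "555-0101", "city": "Springfield"},
    {"ad_id": "a2", "phone": "555-0102", "city": "Shelbyville"},
    {"ad_id": "a3", "phone": "555-0101", "city": "Capital City"},
]
tips = [
    {"tip_id": "t1", "phone": "555-0101", "source": "hotline"},
]

# Index ads by phone number, then look for tips that share a number.
ads_by_phone = defaultdict(list)
for ad in ads:
    ads_by_phone[ad["phone"]].append(ad)

for tip in tips:
    matches = ads_by_phone[tip["phone"]]
    if len(matches) > 1:
        # One number behind many ads plus a tip is a lead worth reviewing.
        print(tip["phone"], "->", [m["ad_id"] for m in matches])
```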
http://www.businesswire.com/news/home/20170926005331/en/Splunk-Trains-Workforce-Tomorrow-Amazon-Web-Services
BlueData Provides Ability to Deploy Containerized Hadoop and Spark for Dell EMC Elastic Data Platform
SANTA CLARA, CA--(Marketwired - Sep 26, 2017) - #BlueData®, a leading #BigDataasAService company, announced that #DellEMC is using BlueData EPIC software to help power its new Elastic Data Platform. Using BlueData EPIC and the #DellEMCElasticDataPlatform, customers can spin up instant clusters for #Hadoop, #Spark, and other Big Data tools running on #Docker #containers.
"We've been working together with Dell EMC across our product, services, and sales teams to help our joint customers get the most out of their Big Data investments," said Kumar Sreekanti, CEO of BlueData. "This latest collaboration is further evidence that our software has become the solution of choice for containerized Big Data deployments in the enterprise."
The Dell EMC Elastic Data Platform provides on-demand access to Big Data analytics and data science workloads (e.g., Hadoop, Spark, machine learning, and other use cases) in a highly scalable, flexible, and secure multi-tenant architecture. The platform delivers fast and easy provisioning, simplified deployments, cost-efficiency, and assurance that governance and security requirements are being met. It includes Dell EMC infrastructure, software from BlueData as well as BlueTalon for data-centric security, and Dell EMC Professional Services:
- Deploying Big Data environments: BlueData provides the ability to quickly create elastic, multi-tenant Big Data environments for data science and analytics using Docker containers -- for Big-Data-as-a-Service whether on-premises, in the cloud, or in a hybrid architecture.
- Separating compute and storage: When aggregate datasets grow larger than a few hundred terabytes, it makes sense to separate compute from storage so both can scale independently (see the code sketch below). For enterprises needing scale-out storage, Dell EMC Isilon offers a compelling ROI, ease of use, and scalability.
- Enforcing centralized policy: BlueTalon provides consistent creation and enforcement of data access policies across environments, supporting a diverse set of users, tools and data systems.
- Automating and integrating: Dell EMC Professional Services have automated the deployment of the above components and provide an open, flexible set of interfaces for integrating into existing Big Data environments.
To learn more about the Elastic Data Platform and BlueData's collaboration with Dell EMC, visit the BlueData booth (#433) at the Strata Data Conference in New York City this week, September 26 to 28.
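Below is a minimal PySpark sketch of the compute/storage separation described in the list above: the Spark cluster supplies the compute while an HDFS-compatible store such as Isilon holds the data. The host name, path, and column name are placeholders, and running it assumes an already-configured Spark environment.

```python
# Minimal sketch: Spark (compute) reading from a remote HDFS-compatible
# store (storage), so each tier can be scaled independently.
# Host, path, and column names are placeholders.
from pyspark.sql import SparkSession

spark = (
    SparkSession.builder
    .appName("elastic-data-platform-demo")
    .getOrCreate()
)

# The data lives on a separate storage tier; only the URI couples the two,
# so storage can grow without adding compute nodes, and vice versa.
df = spark.read.parquet("hdfs://isilon.example.com:8020/datalake/events/")
df.groupBy("event_type").count().show()
```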
Here’s the Complicated Back Story of Microsoft and HPE’s Latest Collaboration
#Microsoft is betting that new data center hardware from long-time partners #HewlettPackardEnterprise, #Dell, and #Lenovo will boost its #Azure cloud against rival #Amazon. These machines, which Microsoft execs said are now available after a long buildup, will let businesses run Microsoft Azure software in their own server rooms and connect it as needed to Microsoft data centers for additional computing or storage capacity. Pairing on-premises computing with cloud data centers is an example of what techies call "hybrid cloud," and most tech vendors see it as a key advantage over Amazon, whose model is mostly about moving customer data and applications into AWS facilities.
HPE (HPE, +1.36%) said Monday that its ProLiant for Microsoft Azure Stack server is now available. When HPE announced the product in July, it pegged prices at between $300,000 and $400,000 depending on customer configuration. Dell Technologies disclosed plans for Azure Stack hardware in May and will say more about its entry at the Microsoft Ignite tech conference in Orlando on Wednesday, but a spokesman said the list price for Dell EMC Cloud for Microsoft Azure Stack starts at $265,000, including implementation services. Lenovo confirmed that its Azure Stack box is now available, but did not provide pricing.
That Microsoft (MSFT, +0.04%) relies on hardware makers, also known as original equipment manufacturers or OEMs, to propagate its software is nothing new. Microsoft built its lucrative Windows and Office franchises by teaming with HPE's parent company HP Inc., Dell Technologies' forebear Dell Inc., and IBM (IBM, +0.51%). Those companies sold desktop and laptop PCs pre-installed with Windows, and often Office, to consumers and businesses alike. Microsoft partnered with the same companies to push Windows Server software into corporate data centers. For the record, IBM sold its PC business to Lenovo in 2004 and its Intel (INTC, +1.18%)-based server business to the same company ten years later.
But Microsoft's rapport with these hardware allies has long been tense, and it became more fraught over the past decade as the software giant itself moved into the hardware business, designing Surface tablets and laptops and offloading their production mostly to low-cost contract manufacturers. Thus, Microsoft Surface devices competed directly with laptops from -- you guessed it -- Dell, HP (HPQ, +0.15%), and Lenovo.
At roughly the same time, Microsoft, along with Facebook (FB, +0.84%), Google (GOOG, +0.26%), Amazon (AMZN, -0.10%), and other web companies gearing up massive data centers of their own, started designing its own servers and switching to hardware built by contract manufacturers. That put a dent in sales of name-brand server and networking hardware from companies like HPE, Lenovo, and Dell, along with Cisco (CSCO, +0.30%). Over the summer, HPE chief Meg Whitman repeatedly warned that the company needs to assess whether it can keep supplying low-end servers to cloud vendors. In one quarter, she blamed the fall-off in sales of such servers on one major buyer, which was later "outed" as being Microsoft.
HPE, Dell, and Lenovo need to sell their gear -- and related services -- to corporate IT departments; Azure Stack gives them a way to do so while also combating the existential threat most of these companies see in AWS.
Analyst Patrick Moorhead, president of research firm Moor Insights & Strategy, summed it up: "While hardware makers would like to run inside Azure Public Cloud, the next best thing is to be on customer premises. Azure Stack gives them a way to do that." Gartner (IT, +0.32%) vice president Ed Anderson agreed: Azure Stack gives hardware partners a play in a market from which they would otherwise be excluded, he said via email. "Microsoft is effectively throwing them a bone. I think the challenge for the OEMs is to demonstrate real value in this relationship, which means driving Azure adoption and increased utilization. If the hardware OEMs get in the way of this primary goal, or slow down the process in any way, Microsoft is likely to build its own system and once again bypass their partners."
http://fortune.com/2017/09/26/hpe-microsoft-azure-stack-hardware/