
Tuesday, January 31, 2017

Dell EMC Puts Pressure On Partners To Boost Sales, Maintain Tier Status

#DellEMC is putting pressure on solution providers to prove they belong in their assigned tiers in the company's new unified partner program. Dell EMC's decision to "status match" solution providers into the program – top-tier partners from the legacy Dell and EMC programs become top-tier partners in the new program, and so on – has some solution providers racing to boost revenue to maintain that tier status. "New tier status is not automatic," Kimberly DeLeon, head of Dell EMC channel programming, told partners recently. In Dell EMC's 2019 fiscal year, which begins in February 2018, tier eligibility will be based on new revenue targets and training requirements, she said.
http://m.crn.com/news/channel-programs/300083580/dell-emc-puts-pressure-on-partners-to-boost-sales-maintain-tier-status.htm?itc=hp_ots


Cloudera Announces General Availability of Apache Kudu with Release of Cloudera Enterprise 5.10

Palo Alto, Calif., Jan. 31, 2017 (GLOBE NEWSWIRE) -- #Cloudera, the global provider of the fastest, easiest, and most secure data management, analytics and machine learning platform built on the latest open source technologies, today announced that #Apache #Kudu, the open source software ( #OSS ) storage engine for fast analytics on fast-moving data, is now shipping as a generally available component within Cloudera Enterprise 5.10. Kudu simplifies the path to real-time analytics, allowing users to act quickly on data as it happens to make better business decisions.

http://finance.yahoo.com/news/cloudera-announces-general-availability-apache-120000890.html

Startup Raises $165 Million To Take Microsoft, HPE Customers In $34 Billion Market

In the battle for a single market, being big can cost you customers and being small can help you gain market share. And the competitive success of one small company in a $34 billion market does not bode well for investors in two big rivals. More specifically, #Microsoft and #HewlettPackardEnterprise are losing customers to Los Altos, Calif.- and Vienna, Austria-based automated software testing tool supplier #Tricentis in the $34 billion software testing market. (I have no financial interest in the companies mentioned in this post.)

http://www.forbes.com/sites/petercohan/2017/01/30/startup-raises-165-million-to-take-microsoft-hpe-customers-in-34-billion-market/#5f932e6c6306

Oracle effectively doubles licence fees to run its stuff in AWS

#Oracle has changed the way it charges users to run its software in #AmazonWebServices, effectively doubling the cost along the way. Big Red's previous licensing regime [PDF] recognised that #AWS 's virtual CPUs were a single thread of a core that runs two threads. Each virtual CPU therefore counted as half a core. That's changed: Oracle's new cloud licensing policy [PDF] says an AWS vCPU is now treated as a full core if hyperthreading is not enabled. A user renting two AWS vCPUs therefore needs to pay full freight for both, effectively doubling the number of Oracle licences required to run Big Red inside AWS. And therefore doubling the cost as well.

The new policy also says: “When counting Oracle Processor license requirements in Authorized Cloud Environments, the Oracle Processor Core Factor Table is not applicable.” That table [PDF] says Xeon cores count as half a licence. Making the table inapplicable to the cloud again doubles the licence count required.

The Register learned of the change from this post by Oracle-watcher Tim Hall. Pieter Jansen, owner of Navicle, an Australian Oracle licensing consultancy, confirmed Hall's analysis and the effect on price to El Reg. “The unit cost does not change,” Jansen said, adding that he's yet to hear whether Oracle will grandfather previous arrangements for current users, who face larger bills if Big Red does not cut them a break. The new policy also makes AWS and Azure equal in Oracle's eyes: the latter now needs one licence per core too.

Jansen suspects the decision was made to make Oracle's own cloud a more attractive proposition. He may be onto something, as Oracle last year named AWS as public enemy number one in the cloud, claiming its own cloud will be cheaper and faster than the Amazonian offering. These new licensing arrangements certainly look to be taking care of the cheaper side of the deal. The Register has asked Oracle why it has made this change and how it will treat existing customers, but was told a response would not be possible until Oracle USA opens its doors this week. We'll update this story, or write a new one, if we receive a response. ®
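To make the arithmetic concrete, here is a minimal Python sketch of the licence counting as the article describes it – the function names and the hyperthreading flag are illustrative assumptions, not Oracle's official calculator:

def old_policy_licences(vcpus):
    # Old regime: each AWS vCPU counted as half a core, and the
    # Processor Core Factor Table (0.5 for Xeon) applied on top.
    cores = vcpus * 0.5
    return cores * 0.5

def new_policy_licences(vcpus, hyperthreading_enabled=False):
    # New regime: a vCPU counts as a full core when hyperthreading is
    # not enabled, and the core factor table no longer applies.
    cores = vcpus / 2 if hyperthreading_enabled else vcpus
    return cores

print(old_policy_licences(8))   # 2.0 licences under the old reading
print(new_policy_licences(8))   # 8.0 licences under the new policy

On that reading, an eight-vCPU instance goes from two licences to eight – the two doublings the article describes, compounded.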
https://www.theregister.co.uk/2017/01/30/oracle_effectively_doubles_licence_fees_to_run_in_aws/

Scale Computing Launches First Multi-Tier Channel Program In Massive Midmarket Hyper-convergence Blitz

Fast-growing #hyperconverged virtualization appliance maker #ScaleComputing is rolling out its first multi-tier channel program as part of a massive midmarket offensive. Scale Computing CEO Jeff Ready said the channel expansion will boost the company's fervent following among small and midsize business customers anxious to avoid what he calls exorbitant #VMware licensing fees.

http://m.crn.com/news/data-center/300083585/scale-computing-launches-first-multi-tier-channel-program-in-massive-midmarket-hyper-convergence-blitz.htm

Red Hat Assists Monash University with Deployment of Software-Defined Storage to Support Advanced Research Capabilities

RALEIGH, N.C. — January 30, 2017 — #RedHat, Inc. (NYSE: RHT), the world's leading provider of open source solutions, today announced that Monash University, one of Australia’s most prestigious research universities, has implemented a massive multi-petabyte deployment of Red Hat #Ceph Storage.

https://www.redhat.com/en/about/press-releases/red-hat-assists-monash-university-deployment-software-defined-storage-support-advanced-research-capabilities

The Most Controversial Quantum Computer Ever Made Just Got an Upgrade

#Quantumcomputers are, without a doubt, the future of computing. And while there are dozens of studies trying to make quantum computers more practically usable for the average person, we have yet to create a quantum computer that works in the same way our current computers do. This inability is largely due to the difficulties in sustaining qubits (or quantum bits, the units of quantum information), the most important part of quantum computing. There are a few companies, however, that have shown that it is, at least, possible to create a working quantum computer. One such company is D-Wave, and this year it released the latest version of its quantum computer. It’s called the 2000Q System, with, you guessed it, 2,000 qubits in its processor – double the count of its 1,000-qubit predecessor. It’s the Canadian company’s most capable machine yet. Some criticize D-Wave’s machine, saying it isn’t really a quantum computer, mostly because its qubits aren’t built the way they would be for a general-purpose, gate-model quantum computer. As such, these qubits are more fragile and offer less precision in manipulation. Moreover, D-Wave’s quantum computers can only solve one type of computing task, known as optimization problems, through a process called quantum annealing – basically, magnetic fields that represent the problem nudge qubits in superposition towards a new state that provides the best configuration to solve a given problem.
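For a feel of what an annealer computes, here is a tiny classical simulated-annealing sketch over a QUBO (quadratic unconstrained binary optimization) problem – a loose classical analogue of the process described above, not D-Wave's actual hardware algorithm, and the toy problem is made up:

import math
import random

# Toy QUBO: minimize sum of Q[i,j]*x[i]*x[j] over binary x.
# This one rewards turning either bit on but penalizes both at once.
Q = {(0, 0): -1.0, (1, 1): -1.0, (0, 1): 2.0}

def energy(x):
    return sum(c * x[i] * x[j] for (i, j), c in Q.items())

def anneal(n=2, steps=2000, t0=2.0):
    x = [random.randint(0, 1) for _ in range(n)]
    for step in range(steps):
        t = t0 * (1 - step / steps) + 1e-9    # cooling schedule
        cand = x[:]
        cand[random.randrange(n)] ^= 1        # flip one bit
        d_e = energy(cand) - energy(x)
        if d_e <= 0 or random.random() < math.exp(-d_e / t):
            x = cand                          # accept downhill moves, and
    return x, energy(x)                       # uphill ones with fading odds

print(anneal())   # expect ([1, 0], -1.0) or ([0, 1], -1.0)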
https://futurism.com/the-most-controversial-quantum-computer-ever-made-just-got-an-upgrade/

Streaming Live Data and the Hadoop Ecosystem

Oleg Zhurakousky discusses the #Hadoop ecosystem – Hadoop, #HDFS, #Yarn – and how projects such as #Hive, #Atlas, and #NiFi interact and integrate to support the variety of data used for analytics.

https://www.infoq.com/presentations/streaming-hadoop

Hadoop vendors make a jumble of security

A year ago a Deutsche Bank survey of CIOs found that “CIOs are now broadly comfortable with [ #Hadoop ] and see it as a significant part of the future data architecture.” They're so comfortable, in fact, that many CIOs haven’t thought to question Hadoop’s built-in security, leading Gartner analyst Merv Adrian to query, “Can it be that people believe Hadoop is secure? Because it certainly is not.” That was then, this is now, and the primary Hadoop vendors are getting serious about security. That’s the good news. The bad, however, is that they’re approaching Hadoop security in significantly different ways, which promises to turn big data’s open source poster child into a potential pitfall for vendor lock-in.

http://www.infoworld.com/article/3162399/analytics/hadoop-vendors-make-a-jumble-of-security.html

Microsoft is cooking virtual storage in Azure

#Microsoft is prepping something to do with virtual storage appliances spanning on-premises storage and #Azure and will reveal all in April. This story starts with an early January post announcing that #SoftNAS can now offer its virtual NAS devices across on-premises storage and Azure. That post piqued The Register's interest, because we are aware of no other virtual storage vendor that can play that way in #Azure. SoftNAS is not a storage industry titan, but we figured that if Microsoft sees value in virtual storage from a minnow, perhaps it is also open to the idea of working with more established storage players. Our inquiries about Redmond's intentions in this field did not yield a solid answer, but elicited a promise that Microsoft will have something to say on the matter in April.

What might that something be? Virtual storage is a nascent field: #VMware last week claimed leadership in the market with a US$300m run rate, and even that figure was rubbery because it included #hyperconverged kit. In any case, we can discount the idea of VSAN stretching out into Azure, because Microsoft is unlikely to nourish a competitor in that way. This leaves us with Microsoft's cloudy allies and therefore with #HPE, which has the LeftHand virtual storage appliance and a stated commitment to love Azure forever. Also, let's not forget Microsoft's work with #NetApp to better virtualize #FreeBSD, as NetApp's Data #ONTAP is a heavily customised BSD. And NetApp has ONTAP Edge, the virtual version of ONTAP.

Whatever's coming down the pike, the mere fact that Azure is already capable of becoming part of a hybrid virtual NAS shows storage is sure to be challenged by all manner of cloudy concoctions. ®

https://www.theregister.co.uk/2017/01/29/microsofts_cooking_virtual_storage_in_azure/

Dell EMC RecoverPoint for Virtual Machines

Enable quick recovery of #VMware virtual machines to any point in time. #DellEMC #RecoverPoint for Virtual Machines provides continuous data protection (CDP) for operational recovery and disaster recovery. You’ll manage your VM protection simply and efficiently. vAdmins and enterprise application owners can set and manage their VM data protection through a plug-in to VMware vCenter. Automated provisioning and DR orchestration make it easier to meet your recovery point objectives (RPOs) and recovery time objectives (RTOs). RecoverPoint for Virtual Machines is hypervisor-based, software-only data replication that integrates with VMware vCenter (customer supplied).

Key features:
- Protect VMware virtual machines with granular recovery to the VM level
- Use orchestration to enable test, failover, and failback to any point in time
- Replicate VMs (VMDK and RDM) locally and remotely
- Support replication policies over any distance: synchronous, asynchronous, or dynamic
- Use consistency groups for fast, application-consistent recovery of VMs
- Optimize WAN bandwidth use with data compression and deduplication
- Support any storage array on the VMware hardware compatibility list (HCL)
- Manage data protection using the familiar vSphere Web Client user interface
https://store.emc.com/us/Product-Family/EMC-RecoverPoint-Products/Dell-EMC-RecoverPoint-for-Virtual-Machines/p/EMC-RecoverPoint-VM

Sunday, January 29, 2017

Skylake Xeon Ramp Cuts Into Intel’s Datacenter Profits

Every successive processor generation presents its own challenges to all chip makers, and for #Intel the ramp of the 14 nanometer processes that will be used in the future “ #Skylake ” Xeon processors, due in the second half of this year, cut into the operating profits of its Data Center Group in the final quarter of 2016. Intel also apparently had an issue with one of its chip lines – it did not say if it was a Xeon or Xeon Phi, or detail what that issue was – that needed to be fixed, and that hurt Data Center Group’s middle line, too. Still, despite a slowdown in enterprise spending on new compute capacity, Data Center Group turned in a record year for revenues and profits, thanks to growth in #cloud, #hyperscaler, #telecom, and service provider spending on compute capacity, even if the growth rate was only about half of the 15 percent sustained target rate that the company set for the group a few years back. With Intel having such dominant share of the server market – Xeon and Xeon Phi chips commanded 89.2 percent of server revenues and 99.3 percent of server shipments in the third quarter of 2016 – it is no surprise at all that Intel now rises and falls directly with what is going on among large enterprises, cloud builders, hyperscalers, telcos and other service providers, and small and medium businesses. There just is not that much growth that can come from taking share away from #IBM System z mainframes and Power Systems and the remaining bases of Itanium, Sparc, Sparc64, Alpha, and other processors, and the ARM server base is still nascent, offering Intel no opportunity for new sales. In the quarter ended in December, Intel’s overall sales were $16.37 billion across all of its product lines, up 9.8 percent from the year-ago period; net income was $3.56 billion, down 1.4 percent. This drop in earnings was not just due to the 14 nanometer ramp with the Skylake Xeons, but also due to the ramp of #3DNAND flash memory chips as well as Optane 3D XPoint memory for SSDs. There were also some costs related to restructuring charges for the layoffs of 15,000 employees that Intel announced last summer as it moved investments away from its Client Computing Group, which makes chips for PCs, laptops, and tablets, and towards new technologies such as 3D NAND flash, 3D XPoint memory, silicon photonics, rack-scale computing, and Omni-Path switching. Intel’s bottom line was also squeezed by the ramp of 10 nanometer processes for “Kaby Lake” Core processors, due at the end of this year, and their follow-ons, “Cannon Lake,” which will ultimately be re-etched as Xeon server chips, and by the work Intel is doing on 7 nanometer chip manufacturing. And finally, Data Center Group’s profits were also impacted by an unspecified intellectual property cross-licensing and patent deal that Intel did in the quarter with an unnamed communications player. In the final quarter of last year, revenue for platforms at Intel in the Data Center Group – meaning processors, chipsets, motherboards and in some cases systems – came to $4.31 billion, up 7.3 percent, while Other revenue (which is not specified but which should include Omni-Path networking, Rack Scale architecture, and other stuff) amounted to $362 million, up 23.1 percent. But because of all of the issues above, operating income for Data Center Group fell by 13.5 percent to $1.88 billion.
It is a bit strange that Intel did not announce the intellectual property deal when it happened, but the size of the deal may not have been, in and of itself, material to the company’s financials. As for the chip issue in the Data Center Group, Intel’s chief financial officer, Bob Swan, did not elaborate, except to say that this had a greater impact on the numbers, that there were higher than expected failure rates on unspecified components shipped to some high-end customers, and that Intel had found a fix for the issue and set up a reserve to cover the costs of replacement parts. Neither of these issues, said Swan, would affect the books in 2017. The big issue for Intel in 2016 was the shift from the 22 nanometer “Haswell” Xeon processors to the 14 nanometer “Broadwell” parts, which provide a bump in performance and price/performance but which also, at least at first, incur higher costs as the manufacturing process ramps. The real problem is that enterprises, which now account for less than half of the revenues within Data Center Group, are cutting back on spending. This might be a little disconcerting, but it is absolutely predictable given the current global political and economic climate, and given that enterprises are moving more and more workloads to public clouds or using services from hyperscalers and cloud builders where they might otherwise have built and hosted them – much less efficiently – in their own datacenters. Sales of chippery to cloud builders (which includes what we at The Next Platform call hyperscalers as well as public cloud companies) rose by 24 percent in Q4, according to Swan, and telecommunications and service provider companies both grew at close to 30 percent in the period as well. But sales of products that ended up in enterprises or government agencies fell by 7 percent in the period. The growth on one side is enough to fill in the gap on the other, but it is not enough to keep Intel at that target of 15 percent sustained growth for Data Center Group.
https://www.nextplatform.com/2017/01/27/skylake-xeon-ramp-cuts-intels-datacenter-profits/

Physicists have found a metal that conducts electricity but not heat

Researchers have identified a metal that conducts electricity without conducting heat - an incredibly useful property that defies our current understanding of how conductors work. The metal contradicts something called the #WiedemannFranzLaw, which basically states that good conductors of electricity will also be proportionally good conductors of heat, which is why things like motors and appliances get so hot when you use them regularly.
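For reference, the law ties the two conductivities together numerically: κ = L·σ·T, where L is the Lorenz number. A quick Python sketch of the textbook relation shows how closely ordinary metals obey it (the copper figures are standard reference values; the article itself gives no numbers for VO2, so none are invented here):

# Wiedemann-Franz: thermal conductivity ≈ Lorenz number × electrical
# conductivity × absolute temperature, for conventional metals.
LORENZ = 2.44e-8   # W·Ω/K², the theoretical (Sommerfeld) Lorenz number

def expected_thermal_conductivity(sigma_s_per_m, temp_k):
    return LORENZ * sigma_s_per_m * temp_k

# Copper at room temperature: σ ≈ 5.96e7 S/m, T ≈ 300 K
print(expected_thermal_conductivity(5.96e7, 300.0))   # ≈ 436 W/(m·K)
# Copper's measured value is roughly 400 W/(m·K), so the law holds well.
# The VO2 result is striking because its measured thermal conductivity
# falls far below this kind of prediction.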

But a team in the US has shown that this isn't the case for #metallicvanadiumdioxide (VO2) - a material that's already well known for its strange ability to switch from a see-through insulator to a conductive metal at 67 degrees Celsius (152 degrees Fahrenheit).

"This was a totally unexpected finding," said lead researcher Junqiao Wu, from Berkeley Lab’s Materials Sciences Division.

"It shows a drastic breakdown of a textbook law that has been known to be robust for conventional conductors. This discovery is of fundamental importance for understanding the basic electronic behaviour of novel conductors."

Not only does this unexpected property change what we know about conductors, it could also be incredibly useful - the metal could one day be used to convert wasted heat from engines and appliances back into electricity, or even create better window coverings that keep buildings cool.

Researchers already know of a handful of other materials that conduct electricity better than heat, but they only display those properties at temperatures hundreds of degrees below zero, which makes them highly impractical for any real-world applications.

Vanadium dioxide, on the other hand, becomes a conductor at warm temperatures not far above room temperature, which means it has the potential to be a lot more practical.

http://www.sciencealert.com/physicists-have-found-a-metal-that-conducts-electricity-but-not-heat

THIS WEEK IN ENTERPRISE TECH 224 VMware: Hyper-Converged Infrastructure

A deep dive into the #hyperconverged infrastructure market and #VMware #vSAN, with a look at the 2017 VMware roadmap.

https://twit.tv/shows/this-week-in-enterprise-tech/episodes/224

Toshiba is selling off part of its memory business

#Toshiba on Friday officially announced it will sell a portion of its #flashmemory business, including the #SSD business of the Storage & Electronic Device Solutions Division, to a not-yet-named buyer. The company, which invented #NAND flash in the 1980s, announced last week it was exploring spinning off its memory business. Nikkei's Asian Review said Toshiba had been considering spinning off its semiconductor operations and selling a partial stake to #WesternDigital ( #WD ), "as it tries to cope with a massive impairment loss in its U.S. nuclear power unit."

http://www.computerworld.com/article/3162510/data-storage/toshiba-is-selling-off-part-of-its-memory-business.html

Configuring Eclipse for Hadoop MapReduce

SETTING UP THE ENVIRONMENT:
#Eclipse is an IDE used mainly for Java programming. #MapReduce is a software framework, associated with #java programming, with which we can write applications to batch process huge amounts of data – terabytes or petabytes – stored in #Apache #Hadoop. MapReduce is a core component of Hadoop, so we need to install and configure Eclipse for MapReduce. To do that we need Hadoop installed on our machine, with HDFS working correctly. In this document we will install and configure Eclipse for MapReduce and also run a program (“WordCount.java”) on Hadoop using Eclipse. Required software: Hadoop must be installed and working (this document uses Hadoop 2.7.3), and Eclipse must be installed.

INSTALLING HADOOP:
We will assume Apache Hadoop is already installed and the Hadoop Distributed File System (HDFS) is working correctly. If not, install Hadoop first. Verify HDFS by storing some data in it, retrieving and deleting data, and restarting the machine – exercise all the basic operations, and resolve any problems before continuing.

INSTALLING & CONFIGURING ECLIPSE:

DOWNLOADING ECLIPSE:
We need Eclipse to develop Java programs, so download the "Eclipse IDE for Java Developers" package for 64-bit Linux from: https://eclipse.org/downloads/eclipse-packages/

EXTRACTING THE TAR FILE:
In a terminal, change to the directory where the downloaded file resides – say /home/suto/Downloads/downloads – and extract the archive (here ‘eclipse-java-neon-2-linux-gtk-x86_64.tar.gz’):

$ cd /home/suto/Downloads/downloads
$ tar -zxvf eclipse-java-neon-2-linux-gtk-x86_64.tar.gz

After successful extraction we will get a new folder named ‘eclipse’.

DOWNLOADING THE HADOOP ECLIPSE PLUG-IN:
The Hadoop Eclipse plug-in provides tools to ease the experience of MapReduce on Hadoop: it supports creating Map, Reduce, and Driver classes, and helps Eclipse browse and interact with HDFS, submit jobs, and monitor their execution. Download it from: https://github.com/Ravi-Shekhar/hadoop-eclipse-plugin/blob/master/release/hadoop-eclipse-plugin-2.6.0.jar
After downloading, copy the plug-in .jar file into the ‘/eclipse/plugins’ folder.

CREATING A DESKTOP ICON FOR ECLIPSE:
Edit the ‘eclipse.desktop’ file in ‘/usr/share/applications/’ and, under the desktop entry, set ‘Icon’ to the path of ‘icon.xpm’ (here ‘/home/suto/Downloads/downloads/eclipse/icon.xpm’) and ‘Exec’ to ‘/home/suto/Downloads/downloads/eclipse/eclipse’:

$ sudo gedit /usr/share/applications/eclipse.desktop

[Desktop Entry]
Type=Application
Name=Eclipse
Comment=Eclipse Integrated Development Environment
Icon=/home/suto/Downloads/downloads/eclipse/icon.xpm
Exec=/home/suto/Downloads/downloads/eclipse/eclipse
Terminal=false
Categories=Development;IDE;Java;
StartupWMClass=Eclipse

Save the file. Then search for Eclipse in the finder, drag it to the launcher when the menu comes up, and run it from the launcher.

CREATING A PROJECT FOR HADOOP:
Choose the Map/Reduce perspective from the top-right corner of the Eclipse IDE. Then go to the top-left corner and open File --> New --> Map/Reduce Project. Give it a name (this document uses ‘hadoop’), select the option ‘Specify Hadoop library location’, and browse to the folder where Hadoop is installed (here /usr/local/hadoop). Click Next and Finish, and the ‘hadoop’ project will be created.

SETTING DFS LOCATIONS:
Select the Map/Reduce Locations tab at the bottom of the screen. Right-click the blank space and choose ‘New Hadoop location’. Give it a location name (e.g. "master"), a host name (e.g. "localhost"), and port numbers Map/Reduce = 9001 and DFS Master = 9000. Click Finish. This connects the Hadoop server to Eclipse so we can access, execute, and modify data files through Eclipse. It will still show an error connecting to HDFS at this point, so close Eclipse for now.

CREATING DIRECTORIES IN HDFS:
First start all the Hadoop daemons, if they are not started yet, and use jps to check that they are all running:

$ start-all.sh
$ jps

The hadoop command lives in /usr/local/hadoop/bin; rather than changing to that directory every time, add it to the PATH variable so Hadoop commands work from anywhere, then create an input directory:

$ export PATH=$PATH:/usr/local/hadoop/bin
$ hadoop fs -mkdir /user/suto/input

You can create more directories (say input1, input2, etc.) the same way and list them with:

$ hadoop fs -ls /

You can also check from Eclipse: re-open it and see that the new directories appear under the DFS location "master". Now everything is set, and you can run any program through Eclipse.
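Once the environment works, the classic first job is a word count. The tutorial runs the Java WordCount; as a quick sanity check you can get the same effect without Eclipse via Hadoop Streaming with two small Python scripts – a hedged sketch, with the streaming jar path and HDFS paths as assumptions that vary by installation:

mapper.py – emit each word with a count of 1:

#!/usr/bin/env python
import sys
for line in sys.stdin:
    for word in line.split():
        print("%s\t1" % word)

reducer.py – sum the counts for each word (streaming delivers input sorted by key):

#!/usr/bin/env python
import sys
current, total = None, 0
for line in sys.stdin:
    word, count = line.rsplit("\t", 1)
    if word != current:
        if current is not None:
            print("%s\t%d" % (current, total))
        current, total = word, 0
    total += int(count)
if current is not None:
    print("%s\t%d" % (current, total))

Make both scripts executable and submit the job:

$ chmod +x mapper.py reducer.py
$ hadoop jar /usr/local/hadoop/share/hadoop/tools/lib/hadoop-streaming-2.7.3.jar \
    -files mapper.py,reducer.py \
    -mapper mapper.py -reducer reducer.py \
    -input /user/suto/input -output /user/suto/output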

https://www.ibm.com/developerworks/community/blogs/d9a07ec3-11e2-467d-b758-6861c4cb1d44/entry/Configuring_Eclipse_for_hadoop_MapReduce?lang=en

Graphene achieves superconductivity breakthrough: A whole new way to move electrons without resistance

Since #graphene ’s discovery just over a decade ago, scientists have been exploring its remarkable properties and potential uses in a wide range of applications, including that of a superconductor. #Superconductivity is the ability of certain materials to enable the flow of an electric current with little or zero resistance. This is usually only achieved at very low temperatures, which makes superconductivity rather expensive and currently impractical for many applications. Early on, it was theorized that graphene might have superconductive properties, but until now, researchers have been unable to harness its potential without involving other materials in the process. Now, however, a team of researchers has reported a method to unlock graphene’s superconductivity without having to insert calcium atoms into its latticework or place it on another superconducting material – the only methods discovered so far that were able to make graphene display superconductivity. Graphene is an amazing material to begin with – it’s a two-dimensional sheet of carbon atoms that happens to be extremely strong, flexible, lightweight, and conductive. It has the potential to revolutionize a number of technologies, and its as-yet-untapped properties include superconductivity – a theorized potential that now appears to have been confirmed. As mentioned above, researchers have previously only been able to achieve superconductivity with graphene by using other materials in tandem with it. The latest research, conducted by a team of scientists at the University of Cambridge, involved a new approach – one that not only showed graphene is capable of achieving superconductivity on its own, but which also hinted at confirming another postulated theory regarding graphene’s mysterious superconductive properties.

http://www.naturalnews.com/2017-01-28-graphene-achieves-superconductivity-breakthrough-a-whole-new-way-to-move-electrons-without-resistance.html

Saturday, January 28, 2017

Cisco, Bosch, and Foxconn are building blockchain tech for the internet of things

#Cisco, #Bosch, and several other companies have set up a consortium to work on how blockchain can be used to secure and improve “internet of things” applications, as sectors beyond finance seek to benefit from #bitcoin ’s underlying technology. The group, which also includes Bank of New York Mellon, #Foxconn, security company #Gemalto and #blockchain startups Consensus Systems (ConsenSys), BitSE and Chronicled, said on Friday that they will collaborate to develop a shared blockchain protocol for the internet of things – the concept that everyday objects, from washing machines to shipping containers, will be connected to the internet and will be able to send and receive data. While having more devices connected to the internet presents some advantages for consumers and businesses, it also increases the scope of devices which could be hacked. Blockchain is a tamper-proof distributed record of transactions that is maintained by a network of computers on the internet and secured through advanced cryptography. Proponents of the nascent technology believe it could be used to provide additional security and better identity management features to internet of things applications. “We are seeing tremendous potential for the application of blockchain in industrial use cases,” said Dirk Slama, chief alliance officer at Bosch Software Innovations. “Being able to create a tamperproof history of how products are manufactured, moved and maintained in complex value networks with many stakeholders is a critical capability …” The consortium is one of several collaborative efforts by large companies aimed at advancing the development of blockchain technology. Around 40 banks are members of a blockchain consortium run by startup R3 CEV, while technology firms such as IBM and Hitachi are part of a consortium led by the Linux Foundation.
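The "tamper-proof record" idea is easy to see in miniature. Below is a minimal hash-chain sketch in Python – purely illustrative of why earlier entries can't be quietly edited, not the consortium's actual protocol, and the shipment strings are made up:

import hashlib
import json

def make_block(data, prev_hash):
    # Each block commits to its payload AND the previous block's hash.
    body = json.dumps({"data": data, "prev": prev_hash}, sort_keys=True)
    digest = hashlib.sha256(body.encode()).hexdigest()
    return {"data": data, "prev": prev_hash, "hash": digest}

chain = [make_block("container 42 sealed at factory", "0" * 64)]
chain.append(make_block("container 42 loaded onto ship", chain[-1]["hash"]))

# Tamper with the first record...
chain[0]["data"] = "container 42 sealed somewhere else"
body = json.dumps({"data": chain[0]["data"], "prev": chain[0]["prev"]}, sort_keys=True)
recomputed = hashlib.sha256(body.encode()).hexdigest()

# ...and the link to the next block no longer verifies.
print(recomputed == chain[1]["prev"])   # False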

venturebeat.com/2017/01/28/cisco-bosch-and-foxconn-are-building-blockchain-tech-for-the-internet-of-things/

With supplies tight, memory chipmakers head into ultra-super-cycle

The global #memorychip industry is heading into what's been dubbed an ultra-super-cycle, as the challenge of making chips smaller yet more efficient has created supply bottlenecks, while there is soaring demand for data storage - from smartphones and artificial intelligence to autonomous driving and the Internet of Things. Chipmakers and analysts predict the price rally - the average price of benchmark memory chips rose 26-31 percent last year - will continue this year as supplies remain tight. "We expect an ultra-super-cycle instead of just a super-cycle in the 2017 DRAM industry," said CW Chung, an analyst at Nomura, referring to memory chips used in smartphones and computers for short-term data processing and storage. That's left gadget makers scurrying to secure stable supplies, and distributors reporting shipment delays, while chipmakers enjoy bumper earnings. For example, Samsung Electronics, the world's biggest memory chipmaker, this week reported record quarterly operating profit of 4.95 trillion won ($4.26 billion) at its chip business, and its stock price has risen 77 percent over 12 months, a period that includes one of the consumer electronics industry's most damaging product faults. "As of the end of the fourth quarter, our DRAM inventory in particular has gotten tight compared to the previous period after we actively responded to demand," Chun Se-won, Samsung Electronics senior vice president, said after the earnings. Samsung did not detail its inventory levels, but some analysts reckon its DRAM inventory level fell to less than a week at end-December, from nearer a month a year ago. BNP estimates the industry-wide inventory of NAND flash memory chips, used for longer-term data storage, is also less than one week. Toshiba, which may sell part of its core chip business for unrelated financial reasons, said it is receiving orders beyond its capacity for NAND chips, and SK Hynix, while meeting orders for now, warned that an industry-wide shortage of NAND chips will likely persist this year. Leading Chinese smartphone makers such as Huawei [HWT.UL] and ZTE declined to comment on chip supplies. Alibaba-backed Meizu said it has no problems in its smartphone launch or shipment plans. "We have a long-term agreement with our suppliers that ... guarantees more than 3 months of supply at any given moment," global branding manager Ard Boudeling told Reuters.

http://mobile.reuters.com/article/idUSKBN15B0UM

Intel shipping Optane memory modules to partners for testing

#Intel announced its first market-ready #Optane products at CES 2017, in the form of compact slot-format, low-capacity ' #cacheSSD s'. These are its first Optane storage solutions. Intel will also be releasing #Optanememory DIMMs to fit DDR4 slots in servers, PCs and so on. In an earnings call a few hours ago, in the wake of its latest financials, Intel's CEO said Optane memory DIMMs had started to ship to partners for testing. While we have seen impressive demonstrations of Optane storage solutions, less is known about the impact that 3D XPoint technology will have when used in place of RAM DIMMs. Intel gave us some insight into its future plans for #3DXPoint memory solutions a couple of years back, when we first heard about this new non-volatile memory tech.

http://m.hexus.net/tech/news/ram/101929-intel-shipping-optane-memory-modules-partners-testing/

SK Hynix Lays Out Plans for 2017: 10nm-Class DRAM, 72-Layer 3D NAND

#SKHynix this week announced financial results for its fiscal year 2016 and also revealed general plans for 2017. As expected, the company intends to start volume production of new types of memory and expand production capacities. What is noteworthy is that the company will primarily invest in the expansion of #NAND flash manufacturing capacities, rather than the expansion of DRAM production, in the short-term future.

DRAM: 21 nm Ramping, 18 nm(?) on Track for 2H 2017
SK Hynix began to make DRAM using its 21 nm fabrication process in late 2015. The manufacturer has been gradually expanding usage of the technology as well as improving its yields since then. By now, SK Hynix makes a wide range of its products (including mainstream DRAM, mobile DRAM and specialized memory) using its 21 nm manufacturing process. This week the company confirmed that it intends to start volume production of DRAM using its 10 nm-class process technology (which industry experts believe is 18 nm) this year.

http://www.anandtech.com/show/11079/sk-hynix-lays-out-plans-for-2017-10nm-dram-72-layer-nand

Microsoft CEO Satya Nadella: How Azure will fend off Amazon Web Services in the enterprise

#Microsoft believes that its unique combination of cloud offerings will hold off market-leading public-cloud service #AmazonWebServices in Microsoft’s traditional stronghold of the enterprise, CEO #SatyaNadella said in a phone call with press and analysts after the release of quarterly earnings Thursday. Microsoft is the number-two public-cloud company, competing energetically with AWS in an ongoing battle of pricing and features. Cloud computing has always appealed to startups, freeing them from the financial burden of buying, housing and maintaining computing and networking hardware. Instant expandability (and contraction, if necessary) of both compute and storage, plus the availability of APIs into esoteric and otherwise out-of-reach services such as machine learning and voice recognition, make the cloud an easy choice for new companies.

http://www.geekwire.com/2017/microsoft-ceo-satya-nadella-azure-will-fend-off-amazon-web-services-enterprise/

Juniper Networks Growth Driven by Cloud

#Juniper generated $5B in revenue in 2016, with particular success for its new QFX product line. Juniper Networks reported its fourth quarter and full fiscal year 2016 financial results on Jan. 26, showing strength in the fast-growing cloud market. For the quarter, Juniper reported revenue of $1.4 billion, a five percent year-over-year gain. Net income for the fourth quarter was flat at $197.4 million. Full-year revenue was up by three percent year-over-year to approximately $5 billion, and net income for the year was reported at $601.2 million. Looking forward, Juniper provided first quarter fiscal 2017 guidance for revenue of approximately $1.2 billion. Among the big successes for Juniper is the company's QFX portfolio of high-performance data center switches. During Juniper's earnings call with financial analysts, CEO Rami Rahim commented that the QFX family of products saw strong demand, with revenue increasing approximately 90 percent year-over-year in the fourth quarter and over 50 percent for fiscal year 2016.

http://mobile.enterprisenetworkingplanet.com/netsysm/juniper-networks-growth-driven-by-cloud.html

Quantum Computing Progress Will Speed Up Thanks to Open Sourcing

In the quest for ever more powerful computers, researchers are beginning to build #quantumcomputers – machines that exploit the strange properties of physics at the smallest of scales. The field has been making progress in recent years, and quantum computing company #DWave is one of the pioneers. Researchers at #Google, #NASA, and elsewhere have been studying how they can use D-Wave’s chips to solve tricky problems far faster than a classical computer could. Although the field is making progress, it is still largely the domain of an elite group of physicists and computer scientists. However, more minds working on a problem tend to be better than fewer. And to that end, D-Wave took a bold step toward democratizing quantum computing last week by releasing an open-source version of its basic quantum computing software, Qbsolv. “D-Wave is driving the hardware forward,” D-Wave International president Bo Ewald told Wired. “But we need more smart people thinking about applications, and another set thinking about software tools.” Qbsolv is intended to allow more developers to program D-Wave’s computers without the requisite PhD in quantum physics. That is, more people will be able to think about how they’d use a quantum computer – and even begin writing software applications, too. This has profound implications for the future of computing. But first, a little background. What is quantum computing? To understand the significance of D-Wave's announcement, let's take a quantum leap back to the 80s. In 1982, Nobel Prize-winning physicist Richard Feynman suggested that computing could become inconceivably faster by utilizing the basic laws of quantum physics. While digital computers already use physics to process binary digits – or bits – made up of 1s and 0s, Feynman suggested not using bits at all but rather quantum bits, or qubits. Unlike classical bits, qubits can exist simultaneously as both 1s and 0s. This might be described as the probability a qubit is either 1 or 0, but it's actually more subtle than that and relies on a property intrinsic to quantum physics that is impossible to emulate using simple probabilities.
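That last point – amplitudes being subtler than probabilities – can be shown in a few lines of NumPy. A hedged sketch using standard textbook gates, nothing D-Wave-specific:

import numpy as np

ket0 = np.array([1, 0], dtype=complex)   # the |0> state

# The Hadamard gate puts a qubit into an equal superposition of 0 and 1.
H = np.array([[1, 1],
              [1, -1]], dtype=complex) / np.sqrt(2)

plus = H @ ket0
print(np.abs(plus) ** 2)    # [0.5 0.5] – measurement looks like a coin flip

# But apply H again: the complex amplitudes interfere and the qubit
# returns to |0> with certainty – no classical random bit does this.
back = H @ plus
print(np.abs(back) ** 2)    # [1. 0.]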

https://singularityhub.com/2017/01/28/quantum-computing-progress-will-speed-up-thanks-to-open-sourcing/

Huge Growth Expected for OpenStack and Hadoop, Despite Skills Gap

Market research reports are flowing in forecasting huge growth for open platforms #OpenStack and #Hadoop. Last year, #Technavio 's market research analysts predicted that the global Hadoop market would grow at a CAGR of more than 53% over the next four years. The researchers cited several factors, such as data explosion in enterprises and demand for cost-effective solutions to meet big data analytics needs, contributing to the growth of this market. Now, Technavio researchers are out with a new study forecasting that the global cloud management for OpenStack market will grow at a CAGR of 30.49% over the next four years. In both the Hadoop report and the OpenStack one, though, there are citations referring to a skills gap, where organizations are having trouble hiring skilled OpenStack and Hadoop technologists. Allied Market Research has also forecast that the global market for Hadoop along with related hardware, software, and services will reach $50.2 billion by 2020, propelled by greater use of raw, unstructured, and structured data. But there remains a pronounced shortage of skilled Hadoop workers. According to Technavio's report, many organizations may address the shortage of strong Hadoop professionals by adopting Software-as-a-Service (SaaS) Hadoop solutions.
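As a sanity check on what those compound growth rates imply, here is the CAGR arithmetic in a couple of lines of Python (the four-year horizon is the reports' own; the multiples are simply derived from the quoted rates):

def growth_multiple(cagr, years):
    # A CAGR compounds annually: total multiple = (1 + rate) ** years.
    return (1 + cagr) ** years

print(growth_multiple(0.53, 4))     # ≈ 5.5x – the Hadoop forecast
print(growth_multiple(0.3049, 4))   # ≈ 2.9x – the OpenStack forecast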

http://ostatic.com/blog/huge-growth-expected-for-openstack-and-hadoop-despite-skills-gap

THE STATE OF TENNESSEE TREASURY DEPARTMENT HAS $4,393,000 POSITION IN NETAPP, INC. (NTAP)

#TennesseeTreasuryDepartment raised its stake in shares of #NetApp, Inc. (NASDAQ:NTAP) by 91.0% during the third quarter, Holdings Channel reports. The fund owned 122,653 shares of the data storage provider’s stock after buying an additional 58,437 shares during the period. State of Tennessee Treasury Department’s holdings in NetApp were worth $4,393,000 at the end of the most recent quarter. A number of other institutional investors have also modified their holdings of the stock. Dodge & Cox increased its position in shares of NetApp by 0.7% in the second quarter. Dodge & Cox now owns 36,281,441 shares of the data storage provider’s stock valued at $892,161,000 after buying an additional 246,450 shares during the last quarter. Vanguard Group Inc. increased its position in shares of NetApp by 1.6% in the second quarter. Vanguard Group Inc. now owns 25,692,849 shares of the data storage provider’s stock valued at $631,787,000 after buying an additional 401,590 shares during the last quarter. State Street Corp increased its position in shares of NetApp by 0.7% in the second quarter. State Street Corp now owns 12,594,917 shares of the data storage provider’s stock valued at $309,710,000 after buying an additional 83,169 shares during the last quarter. BlackRock Institutional Trust Company N.A. increased its position in shares of NetApp by 14.8% in the second quarter. BlackRock Institutional Trust Company N.A. now owns 9,225,320 shares of the data storage provider’s stock valued at $226,851,000 after buying an additional 1,189,529 shares during the last quarter. Finally, BlackRock Fund Advisors increased its position in shares of NetApp by 2.9% in the third quarter. BlackRock Fund Advisors now owns 5,001,396 shares of the data storage provider’s stock valued at $179,150,000 after buying an additional 142,765 shares during the last quarter. Hedge funds and other institutional investors own 89.98% of the company’s stock.

http://dailyquint.com/2017-01-28-state-of-tennessee-treasury-department-has-4393000-position-in-netapp-inc-ntap/

Docker storage opens new frontiers in quest for enterprise adoption

#Docker storage is the hottest new battleground in the #container wars as competitors clash over enterprise data center territory. Early adopters have debated whether persistent storage and stateful applications are appropriate for containerization, but enterprise IT pros will require these features as they adopt containers. Legacy apps won't disappear, they say, and data portability through Docker has become a multicloud management dream, especially for multinational companies.
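To ground the "persistent storage for stateful apps" idea: a named volume outlives any one container, which is what makes stateful workloads in containers tractable. Below is a hedged sketch using the Docker Python SDK (pip install docker); the image name, volume name, and mount path are illustrative assumptions:

import docker

client = docker.from_env()

# A named volume lives independently of any container that mounts it.
client.volumes.create(name="appdata")

db = client.containers.run(
    "postgres:9.6",
    detach=True,
    environment={"POSTGRES_PASSWORD": "example"},
    volumes={"appdata": {"bind": "/var/lib/postgresql/data", "mode": "rw"}},
)

# Tear the container down and start a fresh one: the data written to
# "appdata" survives, so the replacement container picks up the old state.
db.stop()
db.remove()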

http://searchitoperations.techtarget.com/news/450411882/Docker-storage-opens-new-frontiers-in-quest-for-enterprise-adoption

The limits of RAID: Availability vs durability in archives

There's a strong tendency in storage to go for the one-size-fits-all approach. Nobody wants to manage two systems when one will do. But you need to understand exactly what your use case requires - and ensure the critical success factors are well implemented. That's the case with RAID arrays and archiving. Until about 5 years ago, you had two choices for enterprise archives: tape silos or RAID arrays. Tape silos are still a low-cost way to archive massive amounts of data, but many enterprises look at RAID arrays - especially fully depreciated ones - and think they can save money by re-purposing them.

http://www.zdnet.com/article/availability-vs-durability-in-archives/

FusionStorm Parlays Innovation Drive, Perennial Partner Of Year Honors Into Prestigious Dell EMC Titanium Black Membership

#FusionStorm, which has captured more than a dozen Partner of the Year vendor awards over the past six years, has been handpicked as one of just a handful of elite partners to join the ultra-exclusive #DellEMC #TitaniumBlack program. The San Francisco-based company, which has been widely recognized for driving innovative solution offerings in the fast-growing hyper-converged, software-defined and cloud markets, also sits on the advisory boards of Dell, EMC and #VMware.  The "invitation-only" Titanium Black honor gives FusionStorm access to a wide range of exclusive benefits aimed at driving sales growth. The honor puts the $700 million company, which has its sights set on the $1 billion sales mark, in the same league as a select group of much larger competitors including $14 billion behemoth #CDW, $9 billion powerhouse #WorldWideTechnology and $5.4 billion #Insight Enterprises.

http://m.crn.com/news/channel-programs/300083556/fusionstorm-parlays-innovation-drive-perennial-partner-of-year-honors-into-prestigious-dell-emc-titanium-black-membership.htm

Dell EMC to up India headcount by 15-20%

#DellEMC is ramping up its India R&D headcount by 15-20 per cent by the end of December 2017 and is looking for capabilities across the entire spectrum of engineers – from software engineers right up to the highest level of principal engineers, a top executive told BusinessLine. Systems software, enterprise solutions, systems management software and cloud management software are the areas of expertise the company is looking to hire for. #DellIndia

http://m.thehindubusinessline.com/info-tech/dell-emc/article9505991.ece

6 Key Details Of The New Dell EMC Partner Program

Down To The Wire
#DellEMC Channel Chief John Byrne and his team have begun informing solution providers about how the company's new, unified channel program will work, previewing boosted MDF, a significant investment in incentives and other measures intended to encourage partners to sell across the vendor's broad portfolio and work exclusively with Dell EMC. Solution providers will begin working with the new Dell EMC program when it is officially rolled out the first week of February. Byrne has arranged the program into Gold, Platinum and Titanium tiers with a super-exclusive, invitation-only Titanium Black designation that already has been awarded to a handful of solution providers. Byrne has promised ironclad deal registration, as well as a seamless blending of the legacy Dell and EMC programs to provide profitability, simplicity and predictability. In a webcast this week, Byrne and his team gave about 3,000 solution providers further details about the program. Here are six critical points.

http://m.crn.com/slide-shows/data-center/300083544/6-key-details-of-the-new-dell-emc-partner-program.htm?itc=most_pop

Red Hat Rolls Out OpenShift Container Platform 3.4

#RedHat just launched Red Hat #OpenShiftContainer Platform 3.4. It's based on #RedHatEnterpriseLinux, plus #Kubernetes 1.4 and the #Docker container runtime. The goal is to deploy container-based applications and microservices on "a stable, reliable and more secure enterprise platform." Here's how Red Hat describes some of the new features:

- Next-level container storage, with support for dynamic storage provisioning, allowing multiple storage types to be provisioned, and multi-tier storage exposure via quality-of-service labels in Kubernetes.
- Container-native storage, enabled by Red Hat Gluster Storage, which now supports dynamic provisioning and push-button deployment, enhances the user experience of running stateful and stateless applications on Red Hat OpenShift Container Platform. It makes the consumption and provisioning of application storage easier for developers. With Red Hat Gluster Storage, OpenShift customers get the added benefit of a software-defined, highly available and scalable storage solution that works across on-premises and public cloud environments, and one that can be more cost efficient than traditional hardware-based or cloud-only storage services.
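For a sense of what "dynamic storage provisioning" means in practice, here is a hedged sketch of the Kubernetes side of it: an application requests storage through a PersistentVolumeClaim that names a StorageClass, and the cluster's provisioner creates a matching volume on demand. The class name "glusterfs" and the sizes are illustrative assumptions, not OpenShift defaults:

import yaml   # pip install pyyaml

# A PersistentVolumeClaim: the app asks for 5Gi from a storage class,
# and the provisioner backing that class creates the volume on demand --
# no administrator pre-creating PersistentVolumes by hand.
pvc = {
    "apiVersion": "v1",
    "kind": "PersistentVolumeClaim",
    "metadata": {"name": "app-data"},
    "spec": {
        "storageClassName": "glusterfs",    # assumed class name
        "accessModes": ["ReadWriteOnce"],
        "resources": {"requests": {"storage": "5Gi"}},
    },
}

print(yaml.dump(pvc, sort_keys=False))   # pipe to: kubectl apply -f -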

https://virtualizationreview.com/articles/2017/01/26/red-hat-rolls-out-openshift-container-platform-3_4.aspx?m=1

Five Key Capabilities Surrounding a Top-Notch Hadoop Environment

A data environment supported by #Apache #Hadoop is not for the technically faint-of-heart. However, those that take the plunge into this world of flexible, scalable big data may find themselves with a leg up in multiple facets of their data environment, beyond just storing and managing data. In a recent report, Aberdeen's research suggests that Hadoop usage could be a catalyst for an enhanced and well-rounded data strategy. According to the research, companies that have current Hadoop deployments are also more likely to exploit a variety of technologies and capabilities.

http://www.techrepublic.com/resource-library/whitepapers/five-key-capabilities-surrounding-a-top-notch-hadoop-environment/

Global External Controller-based Disk Storage Market 2017- EMC, IBM, NetApp

The Worldwide External Controller-based Disk Storage Market 2017 research report is an in-depth analysis of the current state of the global external controller-based disk storage market. The report opens with a basic overview of the industry – definitions, classifications, the market by application, and the industry chain structure – and covers development history, the competitive landscape, and development status in major regions. It then discusses development policies and plans, manufacturing processes, and cost structures, and states import/export, supply, and consumption figures as well as cost, price, revenue, and gross margin by region (United States, EU, China, and Japan). Finally, the report profiles the major leading vendors, with company profiles, product pictures and specifications, capacity, production, price, cost, revenue, and contact information. Top manufacturers analyzed in the 2017 report:
1. EMC
2. IBM
3. NetApp
4. Hitachi Data Systems
5. HP
6. Dell
7. Fujitsu
8. Oracle
9. Huawei

http://registrardaily.com/2017/01/27/global-external-controller-based-disk-storage-market-2017/

IBM will build $62 million cloud data center for Army at Alabama base

#IBM will build, manage and operate a private cloud data center for the U.S. Army at Huntsville's Redstone Arsenal, the company says. The deal is worth $62 million to IBM. It is a pilot for the first of four private Army cloud centers that will consolidate hundreds of military data sites around the world. Sam Gordy, general manager of IBM's U.S. Federal division and Tim Kleppinger, vice president and senior client partner of IBM U.S. Federal, discussed the deal and its implications in a Q&A with AL.com this week.

http://www.al.com/news/huntsville/index.ssf/2017/01/ibm_will_build_62_million_clou.html

Microsoft’s cloud strategy leads to another winning quarter on Wall Street

#Microsoft just released its fiscal second quarter earnings report, posting revenue of $24.1 billion and net income of $5.2 billion, beating Wall Street estimates in both cases. The company’s huge acquisition of LinkedIn closed fairly late in the quarter, but still got plenty of attention in the earnings release. Microsoft says LinkedIn brought in revenue of $228 million and a net loss of $100 million for the period “beginning on December 8th, 2016.” LinkedIn is reported under the Productivity and Business Processes segment of Microsoft’s Q2 earnings. It was a big quarter for Microsoft’s cloud strategy as the company continues inching up on industry leader Amazon Web Services. Azure revenue increased by an impressive 93 percent “with Azure compute usage more than doubling year-over-year,” according to the earnings release. The More Personal Computing business was down 5 percent, which Microsoft continues to attribute to still-falling phone revenues. Those were down 81 percent. Windows OEM revenue jumped by 5 percent after a flat Q1, and revenue for the Windows commercial products and cloud services division was also up 5 percent. Gaming dropped by 3 percent year-over-year, which Microsoft explained was due to “lower Xbox console revenue offset by Xbox software and services revenue growth” – basically the same thing we heard last quarter. But Xbox Live subscriptions are a bright spot, now up to 55 million monthly active users – a new record – versus 47 million last quarter. Revenue for the Surface division was down 2 percent compared to this time last year. CEO Satya Nadella will be participating in the company’s investor call this afternoon to speak on Microsoft’s financial outlook and strong-performing cloud business.

http://www.theverge.com/2017/1/26/14402584/microsoft-q2-2017-earnings

Why a partnership with Huawei is a big deal for Red Hat

An expanded partnership with Chinese telecom equipment maker #Huawei is being touted as a big deal by #RedHat (NYSE: RHT). The Raleigh-based open-source software firm is expanding its cooperation with Huawei to include public and network functions virtualization clouds. #nfv

http://www.bizjournals.com/triangle/news/2017/01/26/why-a-partnership-with-huawei-is-a-big-deal-for.html

Thursday, January 26, 2017

The incredible lunar TEMPLE: European space bosses reveal plan for 50m high 'dome of contemplation'

It could be a temple like no other - and with a view that really is out of this world. The #EuropeanSpaceAgency has revealed plans for a lunar temple to be built alongside mankind's first outpost on the moon. The 50m high dome, close to a planned moonbase near the moon's south pole, would give the first settlers 'a place of contemplation'. Artist Jorge Mañes Rubio, part of ESA's future-oriented Advanced Concepts Team (ACT), designed the temple to be built alongside ESA's planned moonbase.

http://www.dailymail.co.uk/sciencetech/article-4162296/The-incredible-lunar-TEMPLE.html

VMware Inc.: VMware Reports Fourth Quarter and Full Year 2016 Results

Record Annual Revenue Exceeds $7 Billion; Q4 Revenue $2.03 billion, up 9% y-y; Additional $1.2B in Stock Repurchases Authorized through February 2018. PALO ALTO, CA -- (Marketwired) -- 01/26/17 -- VMware, Inc. (NYSE: VMW), a leader in cloud infrastructure and business mobility, today announced financial results for the fourth quarter and full year of 2016:

Revenue for the fourth quarter was $2.03 billion, an increase of 9% from the fourth quarter of 2015.
License revenue for the fourth quarter was $887 million, an increase of 8% from the fourth quarter of 2015.
GAAP net income for the fourth quarter was $441 million, or $1.04 per diluted share, up 18% per diluted share compared to $373 million, or $0.88 per diluted share, for the fourth quarter of 2015. Non-GAAP net income for the quarter was $597 million, or $1.43 per diluted share, up 13% per diluted share compared to $534 million, or $1.26 per diluted share, for the fourth quarter of 2015.
GAAP operating income for the fourth quarter was $543 million, an increase of 21% from the fourth quarter of 2015. Non-GAAP operating income for the fourth quarter was $747 million, an increase of 14% from the fourth quarter of 2015.
Operating cash flows for the fourth quarter were $463 million. Free cash flows for the quarter were $419 million.
Total revenue plus sequential change in total unearned revenue grew 13% year-over-year.
License revenue plus sequential change in unearned license revenue grew 14% year-over-year.
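
A note on those last two items: "revenue plus sequential change in unearned revenue" is a bookings-style proxy, not a standard GAAP line. Here is a minimal sketch of the arithmetic; only the $2.03 billion Q4 revenue and the $5.62 billion year-end unearned revenue balance come from this release, while the prior-year revenue is implied by the reported 9% growth and every other figure (and the helper function itself) is a hypothetical placeholder.

# Sketch of "total revenue plus sequential change in total unearned revenue".
# Only the $2.03B Q4 revenue and the $5.62B year-end unearned revenue balance
# come from the release; all other figures are hypothetical placeholders.

def revenue_plus_unearned_change(revenue, unearned_end, unearned_start):
    """Revenue for the period plus the sequential change in unearned revenue."""
    return revenue + (unearned_end - unearned_start)

# Figures in $ billions; the unearned_start values are illustrative only,
# and 1.86 approximates the prior-year quarter implied by 9% growth.
q4_fy2016 = revenue_plus_unearned_change(2.03, unearned_end=5.62, unearned_start=5.10)
q4_fy2015 = revenue_plus_unearned_change(1.86, unearned_end=4.90, unearned_start=4.50)

growth = q4_fy2016 / q4_fy2015 - 1
print(f"Year-over-year growth: {growth:.1%}")  # placeholder output, not the reported 13%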
Annual Review
Revenue for 2016 was $7.09 billion, an increase of 8% from 2015 on a GAAP basis and up 7% on a non-GAAP basis.
License revenue for 2016 was $2.79 billion, an increase of 3% from 2015.
GAAP net income for 2016 was $1.19 billion, or $2.78 per diluted share, up 19% per diluted share compared to $997 million, or $2.34 per diluted share, for 2015. Non-GAAP net income for 2016 was $1.86 billion, or $4.39 per diluted share, up 8% per diluted share compared to $1.73 billion, or $4.06 per diluted share, for 2015.
GAAP operating income for 2016 was $1.44 billion, an increase of 20% from 2015. Non-GAAP operating income for 2016 was $2.29 billion, an increase of 9% from 2015.
Operating cash flows for 2016 were $2.38 billion. Free cash flows for 2016 were $2.23 billion.
Cash, cash equivalents and short-term investments were $7.99 billion, and unearned revenue was $5.62 billion as of December 31, 2016.
The company also announced that its Board of Directors has authorized the repurchase of up to $1.2 billion of its Class A common stock through the end of fiscal 2018, ending on February 2, 2018. Stock will be purchased from time to time, in the open market or through private transactions, subject to market conditions. The stock repurchase authorization is in addition to the Company's ongoing $500 million stock repurchase program announced in December 2016.
'Q4 closed out a strong fiscal 2016 and was one of the most balanced quarters for VMware in years,' said Pat Gelsinger, chief executive officer, VMware. 'We're very pleased with our strong product momentum and customer enthusiasm for our Cloud strategy. We believe we have the world's most complete and capable hybrid cloud architecture, uniquely offering customers freedom and control in their infrastructure decisions.'
Zane Rowe, executive vice president and chief financial officer, VMware, said, 'This was a very good year for VMware demonstrated by strong revenue, earnings and cash flow growth, as well as a significant amount of capital returned to shareholders in the form of stock repurchases. We're pleased to announce the authorization of an additional $1.2 billion of stock repurchases to be completed during fiscal 2018.'
Recent Highlights & Strategic Announcements
In October, VMware and Amazon Web Services announced a partnership to provide a new VMware vSphere®-based cloud service running on AWS. VMware Cloud™ on AWS will make it easier to run any application, using a common set of familiar software and tools, in a consistent hybrid cloud environment. This new service will be delivered, sold and supported by VMware and will be available later in 2017.
VMware hosted over 75,000 customers, partners and influencers at VMworld® 2016, VMworld 2016 Europe and across APJ vForums in 17 cities across 13 countries.
At VMworld 2016 Europe, the company introduced a wave of new products and services designed to help customers accelerate their digital transformation:
Now generally available, the new releases of VMware vSphere®, VMware vSAN™ and VMware vRealize® Automation™ all introduced support for containers, enabling developers to become more productive and IT to easily run containerized applications in production.
VMware further expanded the company's growing ecosystem of VMware Ready™ for Network Functions Virtualization (NFV) certified solutions, with 22 Virtual Network Functions now certified from 19 vendors worldwide. VMware's growing set of NFV partners helps global communications service providers adopt and deploy NFV to transform their operations and service portfolio with speed and confidence.
VMware was positioned as a leader in the 'IDC MarketScape: Worldwide Virtual Client Computing Software 2016 Vendor Assessment (doc # US40700016, November 2016).' The report evaluated eight vendors based on criteria that span across strategies and capabilities. VMware was recognized for the second consecutive year for having the most complete mix of business and solution strategies and capabilities for delivering virtual desktops and applications through its VMware Horizon® portfolio of solutions and services that includes VMware Horizon® 7, Horizon Air™, Horizon FLEX™ and VMware App Volumes™.
The company will host a conference call today at 2:00 p.m. PT/5:00 p.m. ET to review financial results and business outlook. A live web broadcast of the event will be available on the VMware investor relations website at http://ir.vmware.com. Slides will accompany the web broadcast. The replay of the webcast and slides will be available on the website for two months. In addition, six quarters of historical revenue data, including year-over-year comparisons, will be made available at http://ir.vmware.com in conjunction with the conference call.
https://www.twst.com/update/vmware-inc-vmware-reports-fourth-quarter-and-full-year-2016-results/

News Bits: Cavium, Cisco, Linux, Commvault, Cumulus, Apstra, & Pure

In this week’s News Bits we look at a number of small announcements, small in terms of content rather than impact. #Cavium will contribute the first programmable #Wedge100C switch design to the #OpenComputeProject (OCP) Foundation. #Cisco has a handful of announcements, including a new data and analytics training portfolio, an all-in-one cloud-based meeting room product, and its intent to acquire #AppDynamics. #TheLinuxFoundation runs conferences around the world and will hold conferences in China in 2017. #Commvault has launched a reference architecture for #AWS. #Cumulus ’s #Linux network OS will run on #Facebook ’s Backpack switch. #Apstra announced integration with #FacebookWedge100. And #PureStorage announced general availability of its #FlashBlade product.
http://www.storagereview.com/news_bits_cavium_cisco_linux_commvault_cumulus_apstra_pure

Hortonworks Touts Partnerworks Channel Program Success, Looks To International Expansion

One year after launching its #Partnerworks channel program, big data software developer #Hortonworks has grown its partner ranks by 400, introduced a certification track for managed service providers, expanded partner sales and engineering certification offerings, and improved communications with its 2,100 solution provider partners. "It was a pragmatic year" for Partnerworks, said Chris Sullivan, Hortonworks' senior vice president of channels and alliances, in an interview with CRN. "Our channel community, the partner community, is strategic and critical to our business. We're pleased with the results." Hortonworks, founded in 2011, is a major player in the big data arena, marketing its #Hadoop -based Hortonworks #DataPlatform and #DataFlow products for managing huge volumes of data.
http://m.crn.com/news/applications-os/300083529/hortonworks-touts-partnerworks-channel-program-success-looks-to-international-expansion.htm?itc=hp_ots

A Machine Learning Approach to Log Analytics

Learn how you can maximize #bigdata in the cloud with #Apache #Hadoop in this eBook, brought to you in partnership with #Hortonworks. Opening a #Kibana dashboard at any given time reveals a simple and probably overstated truth: there are simply too many logs for a human to process. Sure, you can do it the hard way, debugging issues in production by querying and searching among the millions of log messages in your system. But this is far from a methodical or productive approach. Kibana searches, visualizations, and dashboards are very effective ways to analyze a system, but a serious limitation of any log analytics platform, including the ELK Stack, is that the people running them only know what they know. A Kibana search, for example, is limited to the knowledge of the operator who formulated it.
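
The article's core observation, that a search-driven workflow is limited to what the operator already knows to look for, is exactly where unsupervised learning can help. As a minimal sketch, not the article's actual implementation, here is the kind of clustering a log analytics pipeline might apply to surface recurring patterns; it assumes scikit-learn is available, and the log lines and cluster count are illustrative.

# Minimal sketch: cluster raw log lines to surface recurring patterns,
# so an operator is not limited to searches they already know to run.
# Assumes scikit-learn; the log lines and cluster count are illustrative.
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.cluster import KMeans

logs = [
    "ERROR db connection timeout host=db-1",
    "ERROR db connection timeout host=db-2",
    "WARN cache miss rate above threshold",
    "INFO user login succeeded id=42",
    "INFO user login succeeded id=97",
]

# Turn each log line into a TF-IDF vector over its tokens.
vectors = TfidfVectorizer().fit_transform(logs)

# Group similar lines; in practice k would be tuned, or a density-based
# method such as DBSCAN used so k need not be chosen up front.
labels = KMeans(n_clusters=3, n_init=10, random_state=0).fit_predict(vectors)

for label, line in sorted(zip(labels, logs)):
    print(label, line)

Each resulting cluster becomes a candidate "pattern" an operator can review once, rather than a million individual lines.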

https://dzone.com/articles/a-machine-learning-approach-to-log-analytics

Microsoft has been investing in security firms in Israel

U.S. software firm #Microsoft will continue to invest over $1 billion annually in cyber security research and development in the coming years, a senior executive said. This amount does not include acquisitions Microsoft may make in the sector, Bharat Shah, Microsoft vice president of security, told Reuters on the sidelines of the firm's #BlueHat cyber security conference in Tel Aviv. "As more and more people use cloud, that spending has to go up," Shah said. While the number of attempted cyber attacks was 20,000 a week two or three years ago, that figure has now risen to 600,000-700,000, according to Microsoft data. Long known for its Windows software, Microsoft has shifted focus to the cloud, where it is dueling with larger rival Amazon to control the still-fledgling market. In October it said quarterly sales of its flagship cloud product #Azure, which businesses can use to host their websites, apps or data, rose 116 percent. In addition to its internal security investments, Microsoft has bought three security firms, all in Israel, in a little over two years: enterprise security startup Aorato, cloud security firm Adallom, and Secure Islands, whose data and file protection technology has been integrated into the Azure Information Protection cloud service. Financial details of these deals were not disclosed.

https://www.google.com/url?sa=t&rct=j&source=web&q=&url=http://www.cnbc.com/2017/01/26/microsoft-to-continue-to-invest-over-1-billion-a-year-on-cyber-security.html&usg=AFQjCNERSrzyCbMvH9tyoS1tV-WWa3ZI3w

VMware user group fight leaves community diminished

Ongoing tension between #Nutanix and #VMware has spilled over into #Virtzilla 's user groups, which have decided to exclude volunteers who work for rivals. That decision has left both vendors somewhat diminished and the user groups' governing body facing possible rebellion by individual user groups. This story starts in the last days of 2016, between Christmas and New Year's Day, when some leaders of individual VMware user groups were thanked for their voluntary service but told they were no longer welcome to serve on groups' leadership committees. Would-be sponsors of VMware user group conferences, day-long events that can attract hundreds of attendees, were also told that they and their money weren't welcome.

https://www.theregister.co.uk/2017/01/25/vmug_gate/