Report: Hoffenheim 2 Bayern 0

Hoffenheim 2 Bayern Munich 0: Ancelotti’s men succumb to in-form Uth double
By Patric Ridge | 10/9/2017

Mark Uth was at the double as Bayern Munich slumped to a surprise 2-0 defeat to Hoffenheim in the Bundesliga.

Bayern Munich slumped to their first defeat of the season as Mark Uth’s clinical double claimed a surprise 2-0 Bundesliga victory for Hoffenheim.

Carlo Ancelotti’s side looked leggy throughout Saturday’s encounter at the Rhein-Neckar-Arena, and Uth made sure that Hoffenheim – the last side to beat Bayern in the Bundesliga back in April – took full advantage.

Starting up front in the absence of the injured Sandro Wagner, Uth capitalised on sloppy defending from Mats Hummels and Javi Martinez to put Hoffenheim ahead 27 minutes in, with Bayern having failed to test Oliver Baumann at the other end bar an early opportunity for Robert Lewandowski.

And despite a positive start to the second half from Bayern, Uth doubled his tally and made it five goals in his last five games in all competitions with a precise finish at the culmination of a swift Hoffenheim counter-attack six minutes after the restart.

The visitors pressed late on, but a late chance for Martinez – set up by James Rodriguez on his Bayern debut – was all they could muster, with Hoffenheim holding firm to move level on points with current league leaders Borussia Dortmund.

Bayern, meanwhile, must now switch their focus to the Champions League, with Anderlecht visiting the Allianz Arena on Tuesday.

On his 400th Bayern appearance, Thomas Muller nearly turned provider with a pinpoint cross into Lewandowski, whose first-time effort clipped off the crossbar on its way over.

Despite Bayern’s dominance, Hoffenheim kept the champions at arm’s length, and Julian Nagelsmann’s tactics paid off when quick thinking from Andrej Kramaric at a throw-in caught out Hummels.

Uth reacted swiftly, stealing in unchallenged to drive into the area and plant a neat finish past Manuel Neuer at the near post.

In search of an immediate response, Lewandowski tried his luck with a free-kick from range, but the Poland striker’s dipping shot inched wide of the left-hand upright.

"4 – Mark #Uth scored 4 of 8 Hoffenheim goals in all comps 2017-18. Pillar. #TSGFCB" — OptaFranz (@OptaFranz) September 9, 2017

Bayern’s determination to restore parity stretched their defence, though, with Neuer doing well to keep out Steven Zuber’s low drive prior to the interval, while Corentin Tolisso headed straight at Baumann.

Kingsley Coman was the next to test Hoffenheim’s defence, drilling over from the edge of the hosts’ area after cutting inside from the left flank.

But it was not to be for Bayern, who found themselves further behind when Uth side-footed home from 10 yards out having latched onto Zuber’s cut-back.

Ancelotti was swift to make a change, bringing on Arjen Robben, though it was the introduction of Bayern’s next substitute – James – which almost paid dividends late on.

Colombia star James whipped in a delightful free-kick that Martinez met, but, with the aid of the crossbar, Baumann managed to keep it out as Hoffenheim held firm for a well-deserved victory.

Key Opta stats:
- Thomas Muller made his 400th appearance in competitive games for Bayern Munich – since the club’s promotion to the top flight in 1965, only 13 players have appeared in more.
- Mark Uth became the first player to score a brace for Hoffenheim against Bayern.
- Coming on matchday three, this is Bayern’s earliest loss in a Bundesliga season since 2011-12, when they lost on matchday one to Borussia Monchengladbach.
- Hoffenheim have kept clean sheets in their last four league home games – a new record for the club.

Sidney Crosby’s Penguins Are The Best Penguins

History was made at Nashville’s Bridgestone Arena on Sunday night, when the Pittsburgh Penguins became the first team of the NHL’s salary-cap era to repeat as Stanley Cup champions. (The NHL instituted a salary cap after labor disputes that resulted in the loss of the entire 2004-05 season.)

Winning back-to-back titles wasn’t as big of a deal for much of the NHL’s history — through the 1970s and ’80s, it wasn’t uncommon to see teams win two, three or even four Stanley Cup titles in a row — but repeating has been notoriously difficult in recent decades. The last franchise to go back-to-back was the Detroit Red Wings, whose ridiculously talented Steve Yzerman-led teams won in 1997 and 1998. And before that, it was Mario Lemieux’s Penguins, buttressed by some teenager from the Czech Republic named Jaromir Jagr; they lifted the Cup in the springs of 1991 and 1992, cementing Pittsburgh as a hockey town.

Just as those Penguins teams from the early 1990s owed a lot to their captain — Lemieux won the Conn Smythe Trophy as the playoff MVP in both Cup-winning campaigns — these Penguins have been powered by their leader, Sidney Crosby. He played brilliantly in this season’s playoffs, scoring 27 points in 24 games, including seven in six games during the Final, and earning a second consecutive Smythe. Only one other player, beyond Crosby and Lemieux, has won back-to-back Smythes since the award was first given out in 1965: goalie Bernie Parent, who led the Philadelphia Flyers to Stanley Cup championships in 1974 and 1975. (And it was never done by all-time greats like Wayne Gretzky, Patrick Roy and Bobby Orr, although each of those players won the trophy at least twice in his career.)

Of course, netminder Matt Murray wasn’t too shabby, either. After returning from injury to play in the conference finals, Murray was virtually unbeatable. In 11 games, he recorded seven quality starts and stopped 93.7 percent of the shots he faced. (Hockey-Reference.com defines a “quality start” as one in which a goalie records a save percentage greater than or equal to the league average for the season; if a goalie faces 20 shots or fewer, he must record an 88.5 percent save percentage for the start to be considered “quality.”)

Here’s the most ludicrous thing of all: Murray led Pittsburgh to not one, but two titles as a rookie. After backstopping the Pens to the title last season, he still qualified as a rookie for 2016-17 because of the way the NHL judges rookie status. That elevates Murray into the same territory as Montreal great Ken Dryden, who as a rookie led the Habs to a Stanley Cup championship in 1971.

Dryden won the Conn Smythe that year, and because he’d played in only six regular-season games, he still qualified as a rookie for the 1971-72 season. The Habs failed to repeat, but Dryden won the Calder Trophy as the league’s best rookie. Regardless of what Murray does over the rest of his career, he and Dryden will always be mentioned in the same breath. That’s not bad company!

Beyond Crosby and Murray, Penguins center Evgeni Malkin was exceptional, finishing as the leading scorer in the playoffs. Geno’s 28 points are tied for the sixth-most of any player in a single postseason since the lockout and are the second-most of his playoff career (trailing the insane 36 points he dropped in 2009, when he won the Conn Smythe).

In the first nine seasons they played together, Crosby and Malkin were playoff fixtures. They won one Cup, but otherwise, the Penguins during that time frequently seemed to disappoint in the postseason. After Pittsburgh’s championship in 2009, its record under coach Dan Bylsma was just 27-27 in the postseason, and the team was 0-5 in elimination games. Despite having two of the best players of their generation, the Pens were underachieving. The Crosby-Malkin era had held such promise, but each star was aging out of his prime. It was beginning to look like they might have missed their window for further championship success.

All that panic feels like a dream now. Two championships in succession have put Pittsburgh’s tally during the Crosby-Malkin era at three — one more than the team earned in the Lemieux-Jagr era.

So where does this place Crosby and Malkin in Penguins lore? It’s difficult (and kind of foolish) to compare eras. The game has changed a lot since Lemieux and Jagr played together, and Crosby and Malkin probably won’t touch their predecessors’ scoring totals. But in terms of titles, the Crosby-Malkin era has been the most successful run in the Penguins’ history. It’s hard to argue with all that silverware.

Why TensorFlow always tops machine learning and artificial intelligence tool surveys

TensorFlow is an open source machine learning framework for carrying out high-performance numerical computations. It provides excellent architecture support, allowing easy deployment of computations across a variety of platforms ranging from desktops to clusters of servers, mobile phones, and edge devices.

Have you ever wondered why TensorFlow has become so popular in such a short span of time? What made TensorFlow so special that we are seeing a huge surge of developers and researchers opting for it? Interestingly, when it comes to artificial intelligence framework showdowns, you will find TensorFlow emerging as a clear winner most of the time. Much of the credit goes to its soaring popularity and the contributions across forums such as GitHub, Stack Overflow, and Quora. TensorFlow is used in over 6,000 open source repositories, showing its roots in many real-world research projects and applications.

How TensorFlow came to be

The library was developed by a group of researchers and engineers from the Google Brain team within Google’s AI organization. They wanted a library that provides strong support for machine learning, deep learning, and advanced numerical computation across different scientific domains. Since Google open sourced the framework in 2015, TensorFlow has grown in popularity, with more than 1,500 project mentions on GitHub.

The constant updates made to the TensorFlow ecosystem are the real cherry on the cake. They ensure that the new challenges developers and researchers face are addressed, easing complex computations and delivering new features, performance improvements, and high-level APIs. By open sourcing the library, the Google research team has gained the benefit of a huge set of contributors outside its core team. The idea was to make TensorFlow popular by open sourcing it, ensuring that new research ideas are implemented in TensorFlow first and allowing Google to productize those ideas.

Read Also: 6 reasons why Google open sourced TensorFlow

What makes TensorFlow different from the rest?

With more and more research and real-life use cases going mainstream, we can see a big trend of programmers and developers flocking towards TensorFlow. Its popularity is quite evident, with big names adopting it for their artificial intelligence work: companies such as NVIDIA, Twitter, Snapchat, and Uber use TensorFlow across major operations and research areas.

On one hand, you could argue that TensorFlow’s popularity rests on its origins. Being developed in-house at Google, it enjoys the reputation of a household name, and there is no doubt that TensorFlow has been better marketed than some of its competitors.

Source: The Data Incubator

However, that is not the full story. There are many other compelling reasons why companies of every size prefer TensorFlow over other machine learning tools.

TensorFlow key functionalities

TensorFlow provides an accessible and readable syntax, which is essential for making these programming resources easier to use; complex syntax is the last thing developers need given machine learning’s advanced nature. TensorFlow also provides excellent functionality and services when compared to other popular deep learning frameworks, as the short sketch below illustrates.
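As a hedged, minimal sketch of that readability (not code from the article), the snippet below trains a tiny classifier with the high-level tf.keras API on synthetic data. The layer sizes, optimizer, and data are purely illustrative.

```python
# A minimal sketch of TensorFlow's high-level tf.keras API on synthetic data.
# Architecture and hyperparameters are illustrative, not a recommendation.
import numpy as np
import tensorflow as tf

# Synthetic inputs: 1,000 samples with 20 features, binary labels.
x_train = np.random.rand(1000, 20).astype("float32")
y_train = np.random.randint(0, 2, size=(1000,))

# A small feed-forward classifier expressed in a few readable lines.
model = tf.keras.Sequential([
    tf.keras.layers.Dense(64, activation="relu", input_shape=(20,)),
    tf.keras.layers.Dense(64, activation="relu"),
    tf.keras.layers.Dense(1, activation="sigmoid"),
])

model.compile(optimizer="adam",
              loss="binary_crossentropy",
              metrics=["accuracy"])

# Training and validation are single calls.
model.fit(x_train, y_train, epochs=5, batch_size=32, validation_split=0.2)
```

The point is less the model itself than how little boilerplate sits between the idea and a training run.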
High-level operations like these are essential for carrying out complex parallel computations and for building advanced neural network models. At the same time, TensorFlow is a low-level library that provides more flexibility: you can define your own functionality or services for your models. This matters to researchers because it allows them to change a model as requirements change. TensorFlow also provides more network control, allowing developers and researchers to understand how operations are implemented across the network and to keep track of changes over time.

Distributed training

The trend of distributed deep learning began in 2017, when Facebook released a paper showing a set of methods to reduce the training time of a convolutional neural network model. The test was run on a ResNet-50 model on the ImageNet dataset, which took one hour to train instead of two weeks, using 256 GPUs spread over 32 servers. This test opened the gates for a wave of research that has massively reduced experimentation time by running many tasks in parallel on multiple GPUs.

Google’s distributed TensorFlow allows researchers and developers to scale out complex distributed training using built-in methods and operations that optimize distributed deep learning across servers. The distributed TensorFlow engine, which is part of the regular TensorFlow repo, works exceptionally well with TensorFlow’s existing operations and functionality. It supports two of the most important distributed methods:

- Distributing the training of a neural network model over many servers to reduce training time.
- Searching for good hyperparameters by running parallel experiments over multiple servers.

Google has given the distributed TensorFlow engine the power to take market share from other distributed projects such as Microsoft’s CNTK, AMPLab’s SparkNet, and CaffeOnSpark. Even though the competition is tough, Google has still managed to become more popular than the alternatives.

From research to production

Google has, in some ways, democratized deep learning. The key reason is TensorFlow’s high-level APIs, which make deep learning accessible to everyone. TensorFlow provides pre-built functions and advanced operations that ease the task of building different neural network models, along with the infrastructure and hardware support that make it one of the leading libraries used extensively by researchers and students in the deep learning domain.

In addition to research tools, TensorFlow extends to production through TensorFlow Serving. Serving is specifically designed for production environments, providing a flexible, high-performance serving system for machine learning models. It makes it easy to deploy new algorithms and experiments as requirements and preferences change, and it offers out-of-the-box integration with TensorFlow models that can be extended to serve other types of models and data. TensorFlow’s API is a complete package that is easy to use and read, with helpful operators, debugging and monitoring tools, and deployment features; a minimal export-and-serve sketch follows below.
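As a rough illustration of that research-to-production path, the sketch below exports a toy Keras model in the SavedModel format that TensorFlow Serving consumes, then queries a hypothetical running Serving instance over its REST API. The model, the /tmp/demo_model path, the model name, and the port are placeholders, and the snippet assumes a tensorflow_model_server process has been started separately; it is a sketch, not the article’s own deployment recipe.

```python
# Hedged sketch: export a toy model for TensorFlow Serving, then query it.
# Paths, model name, and port are placeholders.
import json
import requests
import tensorflow as tf

model = tf.keras.Sequential([
    tf.keras.layers.Dense(1, activation="sigmoid", input_shape=(20,)),
])
model.compile(optimizer="adam", loss="binary_crossentropy")

# Write the model to a versioned directory; Serving watches the parent
# directory and loads the highest version number it finds.
tf.saved_model.save(model, "/tmp/demo_model/1")

# Assumes a server was started separately, e.g.:
# tensorflow_model_server --rest_api_port=8501 \
#     --model_name=demo_model --model_base_path=/tmp/demo_model
payload = {"instances": [[0.0] * 20]}
response = requests.post(
    "http://localhost:8501/v1/models/demo_model:predict",
    data=json.dumps(payload),
)
print(response.json())  # e.g. {"predictions": [[0.5...]]}
```

The same SavedModel directory can be versioned and hot-swapped without touching the client, which is the main appeal of this workflow.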
This end-to-end workflow has led to growing use of TensorFlow as a complete package by an emerging body of students, researchers, developers, and production engineers from many fields who are gravitating towards artificial intelligence.

There is a TensorFlow for web, mobile, edge, embedded and more

TensorFlow provides a range of services and modules within its ecosystem, making it one of the ground-breaking end-to-end tools for state-of-the-art deep learning.

TensorFlow.js for machine learning on the web

A JavaScript library for training and deploying machine learning models in the browser. It provides flexible and intuitive APIs to build and train new and pre-existing models from scratch, right in the browser or under Node.js.

TensorFlow Lite for mobile and embedded ML

A lightweight TensorFlow solution for mobile and embedded devices. It is fast, enabling on-device machine learning inference with low latency, and it supports hardware acceleration through the Android Neural Networks API. Future releases of TensorFlow Lite will bring more built-in operators, performance improvements, and support for more models to simplify the developer’s experience of bringing machine learning to mobile devices.

TensorFlow Hub for reusable machine learning

A library used extensively to reuse machine learning models, so you can apply transfer learning by reusing parts of existing models.

TensorBoard for visual debugging

While training a complex neural network model, the computations you use in TensorFlow can be very confusing. TensorBoard makes it easy to understand and debug your TensorFlow programs through visualizations, letting you inspect and understand your TensorFlow runs and graphs.

Sonnet

Sonnet is a DeepMind library, built on top of TensorFlow, used extensively to build complex neural network models.

All of these factors have made the TensorFlow library immensely appealing for a wide spectrum of machine learning and deep learning projects. The tool has become a preferred choice for everyone from the space research giant NASA and other government agencies to an impressive roster of private sector giants.

Road Ahead for TensorFlow

TensorFlow is, no doubt, better marketed than the other deep learning frameworks, and the community appears to be moving very fast. In any given hour, roughly 10 people around the world contribute to or improve the TensorFlow project on GitHub. TensorFlow dominates the field with the largest active community, and it will be interesting to see what new advances TensorFlow and other utilities make possible for the future of our digital world.

Continuing the recent trend of rapid updates, the TensorFlow team is making sure it addresses the challenges contributors and developers currently face while building machine learning and deep learning models. TensorFlow 2.0 will be a major update; the release candidate is expected by early March next year, with a preview version landing later this year. The major focus will be on ease of use and additional support for more platforms and languages, and eager execution will be the central feature of TensorFlow 2.0. This version will add more functionality and operations to handle current research areas such as reinforcement learning and GANs, and to build advanced neural network models more efficiently. A small taste of the eager style is sketched below.
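As a brief, hedged illustration of what eager execution means in practice (assuming a TensorFlow build with eager enabled, as TensorFlow 2.0 does by default), the operations below run immediately and gradients come from tf.GradientTape rather than from a separately constructed graph and session. The values are arbitrary.

```python
# A minimal sketch of eager execution: operations evaluate immediately,
# and gradients come from tf.GradientTape instead of a static graph/session.
import tensorflow as tf

x = tf.constant(3.0)

with tf.GradientTape() as tape:
    tape.watch(x)              # constants must be watched explicitly
    y = x ** 2 + 2.0 * x       # evaluated right away; y is a concrete tensor

dy_dx = tape.gradient(y, x)    # d/dx (x^2 + 2x) = 2x + 2 = 8.0 at x = 3

print(y.numpy(), dy_dx.numpy())  # 15.0 8.0
```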
Google will continue to invest in and upgrade its existing TensorFlow ecosystem. According to Google CEO Sundar Pichai, “artificial intelligence is more important than electricity or fire.” TensorFlow is the solution Google has come up with to bring artificial intelligence into reality and provide a stepping stone to revolutionizing humankind.

Read more:
- The 5 biggest announcements from TensorFlow Developer Summit 2018
- The Deep Learning Framework Showdown: TensorFlow vs CNTK
- Tensor Processing Unit (TPU) 3.0: Google’s answer to cloud-ready Artificial Intelligence