Firefighters contain house fire in Pacific Highlands Ranch area

By Ed Lenderman | Posted: April 25, 2018 | Updated: 9:54 AM

SAN DIEGO (KUSI) — A family of four escaped their burning home this morning in San Diego's Pacific Highlands Ranch neighborhood near Carmel Valley, but their garage and two vehicles were destroyed by the fire and smoke damaged much of the home, causing a loss of about a quarter-million dollars, authorities said.

The blaze, reported about 4:15 a.m., prompted fire crews to evacuate several nearby homes in the 15000 block of Sierra Rose Trail near Blue Dawn Trail, San Diego Fire-Rescue Department officials said. No injuries were reported.

Firefighters arrived at the upscale community to find flames and heavy smoke spewing from the garage of the home, SDFD spokeswoman Monica Munoz said. Crews quickly worked to evacuate nearby homes and search the burning house, though they later learned the two adults and two children who lived there had already evacuated safely.

Fire crews knocked down the flames by 4:40 a.m. and extinguished a hot spot that flared up a few hours later.

According to reporters at the scene, the home was equipped with new smoke alarms that alerted the family and gave them plenty of time to escape unharmed. Personnel from the American Red Cross responded to help the family find temporary shelter, Munoz said.

"Fortunately, no one was hurt and the damage was mostly confined to the garage, with a bit of extension into the second-floor bedroom above the garage," Munoz said. "The combination of the fire door between the garage and the home, and quick action by firefighters, saved the first floor of this home from burning."

Two vehicles in the garage were destroyed and there was smoke damage throughout the house, Munoz said. Investigators estimate the fire caused $150,000 worth of structural damage and destroyed contents valued at $100,000.

Metro Arson Strike Team investigators were on scene this morning but were unable to determine what caused the blaze.

PHOTOS: Wilmington's Boy Scout Troop 136 Holds Car Wash Fundraiser For McLaren Family

WILMINGTON, MA — Wilmington's Boy Scout Troop 136 held a car wash fundraiser on Saturday, September 8, at the Friendship Lodge.

All proceeds benefited the McLaren family of Wilmington. On July 30, Tom McLaren was injured in a motorcycle accident on his way to work on Route 128. He remains hospitalized with serious injuries. Tom's family includes his wife Wanda and his two children, Leila and David. David is a member of Troop 136.

For those who were out of town and unable to attend, a GoFundMe was set up by friends of the family at http://www.gofundme.com/support-for-the-mclaren-family. Any and all donations are greatly appreciated and will go directly to the family for living and medical expenses.

Below are photos of the event, from Dick Searfoss, posted on the Friendship Lodge's website:

Cairn India-Vedanta merger may be delayed by at least 3 months

Metals and mining major Vedanta's proposed deal to merge with subsidiary Cairn India will take at least another quarter to complete.

On 14 June last year, Vedanta announced its plan to merge with Cairn India in a deal worth $2.3 billion. Anil Agarwal-led Vedanta Resources had said in November last year that the merger would be completed by the April-June quarter of 2016. It has now been disclosed that the merger will be delayed further by "at least a quarter."

"The management has given to understand that the merger is running behind schedule and the companies are still awaiting a date from the high court to convene a meeting of shareholders," The Financial Express quoted an analyst, who participated in the conference call hosted by Cairn India last Friday after its third-quarter results, as saying.

The shareholders' meeting held after the court's order will be the "most crucial" step for the merger to go ahead, according to another analyst who is closely tracking the transaction. For the merger to sail through, a majority of shareholders have to vote in favour of the transaction.

"With regard to the proposed merger with Vedanta Limited, the company is seeking directions of the Bombay High Court for convening a meeting of all our relevant stakeholders," Cairn India said in its statement on 22 January.

Cairn India had given a $1.25 billion loan to Vedanta in July 2014 at below-market rates, which led to a sharp fall in investors' wealth. Initially, Life Insurance Corporation (LIC) — a major stakeholder in the oil and gas explorer — had objected to the move, but did not take any action. In September last year, the stock exchanges gave their "no objection" to the merger.

"The potential merger with parent Vedanta remains a key concern. The proposed swap ratio at Vedanta's current share price implies value for Cairn of only Rs 84/share, implying 30% downside potential to the current price," Nomura said in a note to its clients last week.

Why TensorFlow always tops machine learning and artificial intelligence tool surveys

TensorFlow is an open source machine learning framework for carrying out high-performance numerical computations. It provides excellent architecture support that allows easy deployment of computations across a variety of platforms, ranging from desktops to clusters of servers, mobile devices, and edge devices.

Have you ever wondered why TensorFlow has become so popular in such a short span of time? What made TensorFlow so special that we are seeing a huge surge of developers and researchers opting for it? Interestingly, when it comes to artificial intelligence framework showdowns, you will find TensorFlow emerging as a clear winner most of the time. Much of the credit goes to its soaring popularity and contributions across forums such as GitHub, Stack Overflow, and Quora. In fact, TensorFlow is being used in over 6,000 open source repositories, showing its roots in many real-world research projects and applications.

How TensorFlow came to be

The library was developed by a group of researchers and engineers from the Google Brain team within Google's AI organization. They wanted a library that provides strong support for machine learning, deep learning, and advanced numerical computations across different scientific domains. Since Google open sourced its machine learning framework in 2015, TensorFlow has grown in popularity, with more than 1,500 project mentions on GitHub.

The constant updates made to the TensorFlow ecosystem are the real cherry on the cake. They have ensured that the new challenges developers and researchers face are addressed, easing complex computations and delivering new features, promises, and performance improvements through high-level APIs. By open sourcing the library, the Google research team has received contributions from a huge pool of contributors outside its core team. The idea was to make TensorFlow popular by open sourcing it, ensuring that new research ideas are implemented in TensorFlow first and allowing Google to productize those ideas.

Read also: 6 reasons why Google open sourced TensorFlow

What makes TensorFlow different from the rest?

With more and more research and real-life use cases going mainstream, we can see a big trend of programmers and developers flocking towards TensorFlow. Its popularity is quite evident, with big names adopting TensorFlow to carry out artificial intelligence tasks. Companies such as NVIDIA, Twitter, Snapchat, and Uber use TensorFlow across their major operations and research areas.

On one hand, one can argue that TensorFlow's popularity rests on its origins and legacy. Being developed under the house of Google, TensorFlow enjoys the reputation of a household name, and there is no doubt it has been better marketed than some of its competitors.

[Image: framework popularity comparison — Source: The Data Incubator]

However, that is not the full story. There are many other compelling reasons why small-scale to large-scale companies prefer TensorFlow over other machine learning tools.

TensorFlow key functionalities

TensorFlow provides an accessible and readable syntax, which is essential for making these programming resources easier to use. Complex syntax is the last thing developers need, given machine learning's advanced nature. TensorFlow also provides excellent functionalities and services when compared to other popular deep learning frameworks.
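To make the claim about readable, high-level syntax concrete, here is a minimal sketch of defining and training a small classifier with TensorFlow's Keras API. The synthetic data, layer sizes, and hyperparameters are illustrative assumptions rather than anything taken from the article.

```python
# A minimal sketch of TensorFlow's high-level Keras API (illustrative only).
# The dataset is synthetic random data; layer sizes are arbitrary choices.
import numpy as np
import tensorflow as tf

# Fake data: 1,000 samples with 20 features each, and 3 possible classes.
x_train = np.random.rand(1000, 20).astype("float32")
y_train = np.random.randint(0, 3, size=(1000,))

# A small feed-forward network expressed in a few readable lines.
model = tf.keras.Sequential([
    tf.keras.layers.Dense(64, activation="relu", input_shape=(20,)),
    tf.keras.layers.Dense(64, activation="relu"),
    tf.keras.layers.Dense(3, activation="softmax"),
])

# compile() wires up the optimizer, loss, and metrics in one call.
model.compile(optimizer="adam",
              loss="sparse_categorical_crossentropy",
              metrics=["accuracy"])

# Training is a single method call; TensorFlow handles batching and the loop.
model.fit(x_train, y_train, epochs=5, batch_size=32)
```

The same handful of calls (Sequential, compile, fit) scales from a toy example like this to much larger models, which is a big part of why the API is considered approachable.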
High-level operations like these are essential for carrying out complex parallel computations and for building advanced neural network models. At the same time, TensorFlow is a low-level library that provides plenty of flexibility: you can define your own functionalities or operations for your models. This matters to researchers because it allows them to change a model as requirements change. TensorFlow also provides more network control, letting developers and researchers understand how operations are implemented across the network and keep track of changes over time.

Distributed training

The trend of distributed deep learning began in 2017, when Facebook released a paper showing a set of methods to reduce the training time of a convolutional neural network model. The test trained a ResNet-50 model on the ImageNet dataset in one hour instead of two weeks, using 256 GPUs spread over 32 servers. This revolutionary result opened the gates for a wave of research that has massively reduced experimentation time by running many tasks in parallel on multiple GPUs.

Google's distributed TensorFlow has allowed researchers and developers to scale out complex distributed training using built-in methods and operations that optimize distributed deep learning across servers. The distributed TensorFlow engine, which is part of the regular TensorFlow repo, works exceptionally well with TensorFlow's existing operations and functionalities. It has made it practical to explore the two most important distributed methods: distributing the training of a neural network model over many servers to reduce training time, and searching for good hyperparameters by running parallel experiments over multiple servers.

Google has given the distributed TensorFlow engine the power needed to win market share from other distributed projects such as Microsoft's CNTK, AMPLab's SparkNet, and CaffeOnSpark. Even though the competition is tough, Google has still managed to become more popular than the other alternatives in the market.

From research to production

Google has, in some ways, democratized deep learning. A key reason is TensorFlow's high-level APIs, which make deep learning accessible to everyone. TensorFlow provides pre-built functions and advanced operations to ease the task of building different neural network models, along with the infrastructure and hardware support that make it one of the leading libraries used extensively by researchers and students in the deep learning domain.

In addition to research tools, TensorFlow extends these services to bring models into production with TensorFlow Serving. It is designed specifically for production environments and provides a flexible, high-performance serving system for machine learning models. It offers the functionality needed to deploy new algorithms and experiments easily as requirements change, and it integrates with TensorFlow models out of the box while remaining extensible to other types of models and data.

TensorFlow's API is a complete package that is easy to use and read, and it comes with helpful operators, debugging and monitoring tools, and deployment features.
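As a rough illustration of this research-to-production path, the sketch below exports a Keras model in the SavedModel format that TensorFlow Serving loads, then queries a Serving instance over its documented REST API (/v1/models/<name>:predict). The export path, model name, port, and input values are assumptions for the example, and it presumes a Serving process (for instance the tensorflow/serving Docker image) has already been started against that directory.

```python
# Sketch: export a model for TensorFlow Serving and query it over REST.
# Assumes a Serving instance is already running locally on port 8501 and was
# pointed at the export directory below with --model_name=demo_model.
import json
import requests
import tensorflow as tf

# Build (or reuse) a trained Keras model and export it as a SavedModel,
# the format TensorFlow Serving loads. The path and version "1" are arbitrary.
model = tf.keras.Sequential([
    tf.keras.layers.Dense(3, activation="softmax", input_shape=(20,)),
])
model.compile(optimizer="adam", loss="sparse_categorical_crossentropy")
tf.saved_model.save(model, "/tmp/demo_model/1")

# Query the (assumed) running TensorFlow Serving instance via its REST API.
url = "http://localhost:8501/v1/models/demo_model:predict"
payload = {"instances": [[0.1] * 20]}  # one sample with 20 features
response = requests.post(url, data=json.dumps(payload))
print(response.json())  # e.g. {"predictions": [[...class probabilities...]]}
```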
All of this has led to growing use of TensorFlow as a complete package by the emerging body of students, researchers, developers, and production engineers from various fields who are gravitating towards artificial intelligence.

There is a TensorFlow for web, mobile, edge, embedded and more

TensorFlow provides a range of services and modules within its ecosystem, making it one of the groundbreaking end-to-end tools for state-of-the-art deep learning.

TensorFlow.js for machine learning on the web

A JavaScript library for training and deploying machine learning models in the browser. It provides flexible and intuitive APIs to build and train new and pre-existing models from scratch, right in the browser or under Node.js.

TensorFlow Lite for mobile and embedded ML

A lightweight TensorFlow solution for mobile and embedded devices. It is fast because it enables on-device machine learning inference with low latency, and it supports hardware acceleration through the Android Neural Networks API. Future releases of TensorFlow Lite will bring more built-in operators, performance improvements, and support for more models to simplify the developer experience of bringing machine learning to mobile devices.

TensorFlow Hub for reusable machine learning

A library used extensively to reuse machine learning models, so you can apply transfer learning by reusing parts of existing models.

TensorBoard for visual debugging

The computations involved in training a complex neural network model can be very confusing. TensorBoard makes it easy to understand and debug TensorFlow programs through visualizations, letting you inspect your TensorFlow runs and graphs.

Sonnet

Sonnet is a DeepMind library, built on top of TensorFlow, that is used extensively to build complex neural network models.

All of these factors have made the TensorFlow library immensely appealing for a wide spectrum of machine learning and deep learning projects. The tool has become a preferred choice for everyone from the space research giant NASA and other confidential government agencies to an impressive roster of private-sector giants.

Road Ahead for TensorFlow

TensorFlow is, no doubt, better marketed than other deep learning frameworks, and its community appears to be moving very fast: in any given hour, roughly 10 people around the world are contributing to or improving the TensorFlow project on GitHub. TensorFlow dominates the field with the largest active community, and it will be interesting to see what new advances TensorFlow and its surrounding utilities make possible for the future of our digital world.

Continuing the recent trend of rapid updates, the TensorFlow team is making sure it addresses the current and active challenges faced by contributors and developers building machine learning and deep learning models. TensorFlow 2.0 will be a major update: a preview version is expected later this year, with the release candidate expected by early March next year. The major focus will be on ease of use and additional support for more platforms and languages, and eager execution will be the central feature of TensorFlow 2.0. This breakthrough version will add more functionality and operations for current research areas such as reinforcement learning and GANs, and for building advanced neural network models more efficiently.
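Because eager execution is singled out as the central feature of TensorFlow 2.0, here is a brief sketch of what it changes in practice: operations execute immediately and return concrete values, and gradients are recorded imperatively with tf.GradientTape. The specific tensors and the toy squared "loss" are illustrative assumptions.

```python
# Sketch of eager execution: operations evaluate immediately, no session needed.
# Assumes a TensorFlow release where eager execution is on by default (2.x).
import tensorflow as tf

x = tf.constant([[1.0, 2.0],
                 [3.0, 4.0]])

# This matrix multiplication runs right away and returns a concrete tensor.
y = tf.matmul(x, x)
print(y.numpy())  # [[ 7. 10.]
                  #  [15. 22.]]

# Gradients are recorded imperatively with GradientTape.
w = tf.Variable(3.0)
with tf.GradientTape() as tape:
    loss = w * w  # a toy "loss" just to show the mechanics

grad = tape.gradient(loss, w)
print(grad.numpy())  # 6.0, i.e. d(w^2)/dw evaluated at w = 3
```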
Google will continue to invest in and upgrade the existing TensorFlow ecosystem. According to Google's CEO, Sundar Pichai, "artificial intelligence is more important than electricity or fire." TensorFlow is the solution Google has come up with to bring artificial intelligence into reality and to provide a stepping stone towards revolutionizing humankind.

Read more

The 5 biggest announcements from TensorFlow Developer Summit 2018
The Deep Learning Framework Showdown: TensorFlow vs CNTK
Tensor Processing Unit (TPU) 3.0: Google's answer to cloud-ready Artificial Intelligence