
Sunday, April 22, 2018

3 Benefits of Online Programs at Community Colleges | Education - U.S. News & World Report

"Access to a physical campus is among the advantages of attending a local community college as an online student" reports Bradley Fuster, associate vice president of institutional effectiveness at SUNY Buffalo State.

Many community colleges offer seamless transfers to four-year colleges.
Photo: Maskot/Getty Images

Since the humble beginnings of American community colleges in 1901, these institutions have typically focused on meeting the academic, professional and vocational needs of nontraditional, financially constrained students bound to a certain location.

Community colleges have been successful, growing undergraduate enrollment faster than their four-year public, four-year private and for-profit competitors from 2000 to 2015.

Findings from the National Center for Education Statistics show that in fall 2014, of 6.4 million students attending public community colleges, 1.8 million students enrolled in at least one online course, with 690,000 students attending community colleges completely online.

Over time, more online programs and degrees have become available. For example, Wake Technical Community College in Raleigh, North Carolina, offers more than 100 online programs.

While community colleges may not have the reputational prestige of large, national online colleges, they generally have a much lower sticker price. Here are three additional advantages of attending an online program at a public community college. 
Read more...

Source: U.S. News & World Report


If you enjoyed this post, make sure you subscribe to my Email Updates!

NASA's women scientists rank space movies from worst to best - the list | Science - International Business Times, India Edition

The story is based on a BBC report on NASA's women and their take on films that project their profession.


Follow on Twitter as @JothamManny 
"Scientists at NASA are fans of film, more so when they are based in space - here is a ranking of what real space engineers think of space movies" according to Immanuel Jotham, Science and Technology reporter at International Business Times.

Photo: International Business Times

Scientists at NASA are big fans of movies, especially space movies. However, not all films set in space strike a chord with the engineers who actually develop the technology that will one day take humans into the great beyond. Here is how a team of women NASA employees ranks space movies - from worst to best.
Read more... 

Source: International Business Times, India Edition



Saturday, April 21, 2018

Tour the Space Station in VR with This Amazing 3D, 360-Degree Video | Tech - Space.com

Follow on Twitter as @HarrisonTasoff
"The National Geographic Channel has revealed the first 3D, 360-degree video of space as a part of its new documentary series "One Strange Rock" reports 

We took a virtual tour with the astronauts aboard the International Space Station while hearing their thoughts on the enormity of space, and it left us speechless.

First-Ever 3D VR Filmed in Space | One Strange Rock


A special delivery arrived at the space station last November: a state-of-the-art Vuze VR camera. European Space Agency astronaut Paolo Nespoli brought the camera with him during his daily routine on the station. Nespoli received unique training on the device from series filmmaker Darren Aronofsky himself, who gave the Italian astronaut a crash course in VR filming via Skype. To experience the full impact of the video, watch it on your smartphone while wearing your favorite VR headset.

The video begins in low Earth orbit. An instrumental prelude plays as the space station approaches. The welcoming voice of retired Canadian astronaut Chris Hadfield relates how his 166 days in space changed his world view, both literally and metaphorically. Hadfield is soon joined by former NASA astronauts Mae Jemison, Mike Massimino and Nicole Stott, all of whom discuss their experience of Earth from the rarified vantage point of the space station. [The International Space Station: Inside and Out (Infographic)]

Nespoli carries the trusty camera through the tight quarters of the space station, providing viewers with a 360-degree perspective of life aboard the outpost in the sky. Wires, fixtures and equipment cover nearly every surface of the cabins, but that's nary a problem when you can float past them in microgravity. Nespoli also recorded super-high-definition footage of NASA astronaut Peggy Whitson, the first woman to command the space station, including her final day in space at the end of Expedition 52. The sequences will appear in the series' final episode, which airs on Monday, May 28 at 10 p.m. EDT/9 p.m. CDT.
Read more...

Source: Space.com and National Geographic Channel (YouTube)



Seven Artificial Intelligence Advances Expected This Year | Technology - Forbes

"Artificial intelligence (AI) has had a variety of targeted uses in the past several years, including self-driving cars" continues Forbes Technology Council.

Photo: Shutterstock

Recently, California changed the law that required driverless cars to have a safety driver. Now that AI is getting better and able to work more independently, what's next?
We asked seven technology experts from the Forbes Technology Council their thoughts on the advances and implementations in AI that they expect to see in the year ahead. All the responses touched on how AI can help humans now, instead of much further down the road. This is what they had to say.

1. Improved Patient Health Outcomes 

I expect that we will see an increased focus on improving health outcomes utilizing artificial intelligence. Patients are producing significant amounts of health data with mobile devices and connected wearables. Providers are using electronic health records that generate enormous amounts of information. Applying artificial intelligence to information from both patients and providers can proactively identify health conditions that might otherwise go undetected until much later. - Meghann Chilcott, OrderInsite, LLC
Read more...  

Source: Forbes  



What You Need to Know About Artificial Intelligence | Parade

Photo: Kathleen McCleary
Kathleen McCleary, contributor, says, "Artificial intelligence, the top job trend, is here to stay and it’s changing the face of work."
 
Photo: iStock
If you’ve recently chatted online with customer service, had an X-ray taken or applied for a loan, you’ve likely experienced A.I., including “chatbots,” diagnostic imaging machines and loan algorithms. But the new wave of technology doesn’t necessarily mean unemployment.

“People fear a lot of jobs will be destroyed, but the reality is jobs will change as people team up with technology,” says Andrew Chamberlain, Ph.D., chief economist with job search website Glassdoor. A recent report by McKinsey Global Institute (MGI) found that up to 32 percent of the 166-million-person U.S. workforce will have to move out of their current occupational categories to find work over the next 12 years, but they’ll be taking on different jobs, including some that never existed before.

“Everybody’s job is going to look different by 2030,” says Susan Lund, partner with MGI and an author of the report. Think back to 1980, before personal computers and the internet. PCs have created 19.5 million new jobs in the U.S., from software developers to semiconductor manufacturers. At the same time, 3.5 million jobs have dried up, including typists, secretaries and typewriter manufacturers. Still, the economy has gained 16 million jobs over the past 35 years, thanks to new technology...

Natalie Choate, director of media relations and partnerships for the Texas Tribune, worked for the organization for more than five years in fundraising and membership before moving over to her current job. “I was stepping out into the unknown,” she says. “It was a completely different job—and different jargon.” The Tribune was willing to invest in the time it took her to learn her new gig in order to keep her on staff. “I have very patient co-workers who helped me get from point A to point B,” she says.
Read more...  

Source: Parade



Changing the game: Machine learning in healthcare | Healthcare IT News

"When EHRs can learn – gather and remember – what works best for each user, they can attain maximum efficiency" according to Paul Black, CEO of Allscripts.
 

Photo: Healthcare IT News (blog)

As we live in the new world of quality, value-based care, we must be able to draw more insights and conclusions from ever-increasing amounts of information. We have the data, now we must put it to work. When we combine all of this data with machine learning, we are equipped to make smarter decisions. We have the power to transform healthcare – from the way we use electronic health records to the way we predict and deliver care.  

A game changer for EHRs 
Most EHRs are built on technology that is 20 or 30 years old. Generally, EHRs have kept up with rapid changes in healthcare by making incremental improvements over time. But it is challenging to retrofit EHRs to take full advantage of new innovations.

EHRs must do more than store data. They should be smart enough to deliver the right information at the right time, at the point of care. When an EHR is powered by machine learning, it can pre-populate information based on usage patterns and deliver preference reminders, constantly surveilling trends by user and organization to create opportunities for more effective care...
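The "pre-populate based on usage patterns" idea can be sketched with a simple frequency model. This is a minimal, purely illustrative sketch, not Allscripts code; the clinician names, diagnosis codes, and order labels are all hypothetical.

```python
from collections import Counter, defaultdict

class OrderSuggester:
    """Learns which order each clinician most often places for a given
    diagnosis, then pre-populates that order as a suggestion."""

    def __init__(self):
        # (user, diagnosis) -> Counter of orders placed
        self.history = defaultdict(Counter)

    def record(self, user, diagnosis, order):
        self.history[(user, diagnosis)][order] += 1

    def suggest(self, user, diagnosis):
        counts = self.history[(user, diagnosis)]
        if not counts:
            return None  # no usage pattern yet for this user/diagnosis
        return counts.most_common(1)[0][0]

suggester = OrderSuggester()
suggester.record("dr_smith", "type2_diabetes", "HbA1c panel")
suggester.record("dr_smith", "type2_diabetes", "HbA1c panel")
suggester.record("dr_smith", "type2_diabetes", "lipid panel")
```

A production EHR would weight recency and organization-wide trends as well, but the principle is the same: surveil usage, remember it, and surface the likeliest next action at the point of care.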

A game changer for population health, predictive modeling 
Machine learning is also empowering us to analyze patient data at a level never before possible. We can now transform data into insights and actionable information.

Just think how a "data lake," where we are able to store millions of de-identified patient information to structure and to analyze data and study problems that are meaningful to health care, could transform diabetes care, for example.

We now have the power to compare things like blood sugar levels, body mass index, age and other risk factors and analyze treatment outcomes...
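Comparing risk factors against treatment outcomes can be sketched in a few lines: group de-identified records by a risk factor band and compare average improvement. The records and thresholds below are invented for illustration only.

```python
from statistics import mean

# Hypothetical de-identified records: (bmi, baseline_a1c, a1c_after_treatment)
patients = [
    (24.0, 7.9, 6.8), (31.5, 8.4, 8.1), (27.2, 7.5, 6.9),
    (33.0, 9.1, 8.8), (22.8, 7.2, 6.5), (29.9, 8.0, 7.4),
]

def band(bmi):
    # Stratify by a single risk factor; a real analysis would use many.
    return "bmi<30" if bmi < 30 else "bmi>=30"

improvement = {}
for bmi, before, after in patients:
    improvement.setdefault(band(bmi), []).append(before - after)

# Average drop in blood sugar (HbA1c) per risk band
avg_improvement = {b: round(mean(vals), 2) for b, vals in improvement.items()}
```

On this toy data, patients in the lower-BMI band show a larger average HbA1c drop, which is exactly the kind of cohort comparison a data lake makes possible at scale.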

The way of the future 
...extraordinarily exciting set of capabilities today that didn't exist a decade ago. It enables computers to handle greater amounts of work than human beings can undertake, and will become increasingly important in this era of consumerization...
Read more...

Source: Healthcare IT News (blog)



Friday, April 20, 2018

Machine-learning system processes sounds like humans do | MIT News

"Neuroscientists train a deep neural network to analyze speech and music" says Anne Trafton, MIT News Office.

MIT neuroscientists have developed a machine-learning system that can process speech and music the same way that humans do.
Photo: Chelsea Turner/MIT
Using a machine-learning system known as a deep neural network, MIT researchers have created the first model that can replicate human performance on auditory tasks such as identifying a musical genre.

This model, which consists of many layers of information-processing units that can be trained on huge volumes of data to perform specific tasks, was used by the researchers to shed light on how the human brain may be performing the same tasks.

“What these models give us, for the first time, is machine systems that can perform sensory tasks that matter to humans and that do so at human levels,” says Josh McDermott, the Frederick A. and Carole J. Middleton Assistant Professor of Neuroscience in the Department of Brain and Cognitive Sciences at MIT and the senior author of the study. “Historically, this type of sensory processing has been difficult to understand, in part because we haven’t really had a very clear theoretical foundation and a good way to develop models of what might be going on.”

The study, which appears in the April 19 issue of Neuron, also offers evidence that the human auditory cortex is arranged in a hierarchical organization, much like the visual cortex. In this type of arrangement, sensory information passes through successive stages of processing, with basic information processed earlier and more advanced features such as word meaning extracted in later stages.
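The hierarchical arrangement described above can be sketched as a toy pipeline: each stage consumes the previous stage's output and extracts a progressively more abstract representation. This is purely illustrative, not the MIT model; the feature names and threshold are invented.

```python
def early_stage(waveform):
    # Early stages: basic acoustic features (here, just signal energy).
    return {"energy": sum(x * x for x in waveform)}

def middle_stage(features):
    # Intermediate stages: categorical features built on basic ones.
    features["loud"] = features["energy"] > 1.0
    return features

def late_stage(features):
    # Late stages: task-level output (e.g., a word or genre label).
    features["label"] = "speech" if features["loud"] else "silence"
    return features

def process(waveform):
    # Sensory information passes through successive stages of processing.
    out = waveform
    for stage in (early_stage, middle_stage, late_stage):
        out = stage(out)
    return out
```

The real model replaces each hand-written stage with a trained network layer, but the flow - basic information early, advanced features late - is the point of the analogy.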

MIT graduate student Alexander Kell and Stanford University Assistant Professor Daniel Yamins are the paper’s lead authors. Other authors are former MIT visiting student Erica Shook and former MIT postdoc Sam Norman-Haignere. 

Modeling the brain
When deep neural networks were first developed in the 1980s, neuroscientists hoped that such systems could be used to model the human brain. However, computers from that era were not powerful enough to build models large enough to perform real-world tasks such as object recognition or speech recognition.

Over the past five years, advances in computing power and neural network technology have made it possible to use neural networks to perform difficult real-world tasks, and they have become the standard approach in many engineering applications. In parallel, some neuroscientists have revisited the possibility that these systems might be used to model the human brain.
Read more... 

Journal Reference:
Alexander J.E. Kell, Daniel L.K. Yamins, Erica N. Shook, Sam V. Norman-Haignere, Josh H. McDermott. A Task-Optimized Neural Network Replicates Human Auditory Behavior, Predicts Brain Responses, and Reveals a Cortical Processing Hierarchy. Neuron, 2018; DOI: 10.1016/j.neuron.2018.03.044

Source: MIT News



Tensorflow with Javascript Brings Deep Learning to the Browser | InfoQ.com

Alexis Perrier, data scientist, informs: "At the recent TensorFlow Dev Summit 2018, Google announced the release of Tensorflow.js, a Javascript implementation of Tensorflow, its open-source deep-learning framework. Tensorflow.js allows training models directly in the browser by leveraging the WebGL JavaScript API for faster computations."

Machine Learning in JavaScript (TensorFlow Dev Summit 2018)
 


Tensorflow.js is an evolution of deeplearn.js, a Javascript library released by Google in August 2017. Deeplearn.js was born out of the success of the Tensorflow Playground, an interactive visualization of neural networks written in TypeScript.

Tensorflow.js has four layers: the WebGL API for GPU-supported numerical operations, the web browser for user interactions, and two APIs: Core and Layers. The low-level Core API corresponds to the former deeplearn.js library. It provides hardware-accelerated linear algebra operations and an eager API for automatic differentiation. The higher-level Layers API is used to build machine-learning models on top of Core. The Layers API is modeled after Keras and implements similar functionality. It also allows models previously trained in Python with Keras or as TensorFlow SavedModels to be imported and used for inference or transfer learning in the browser.
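The "eager automatic differentiation" that the Core API provides can be illustrated with forward-mode autodiff on dual numbers. This is a toy sketch of the underlying idea, written in Python rather than Javascript, and is not the Tensorflow.js API itself.

```python
class Dual:
    """A number carrying its value and its derivative together, so a
    single eager forward pass computes both f(x) and f'(x)."""

    def __init__(self, value, deriv=0.0):
        self.value, self.deriv = value, deriv

    def __add__(self, other):
        other = other if isinstance(other, Dual) else Dual(other)
        return Dual(self.value + other.value, self.deriv + other.deriv)

    def __mul__(self, other):
        other = other if isinstance(other, Dual) else Dual(other)
        # Product rule: (uv)' = u'v + uv'
        return Dual(self.value * other.value,
                    self.deriv * other.value + self.value * other.deriv)

def grad(f, x):
    """Evaluate f and df/dx at x in one eager pass."""
    out = f(Dual(x, 1.0))
    return out.value, out.deriv

# f(x) = 3x^2 + 2x, so f(2) = 16 and f'(2) = 14
value, deriv = grad(lambda x: x * x * 3 + x * 2, 2.0)
```

Frameworks like Tensorflow.js use reverse-mode differentiation over tensors for efficiency, but the contract is the same: write ordinary eager code and get gradients back automatically.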

With Tensorflow.js, machine-learning models can be utilized in the browser in three ways: by importing already pre-trained models and using them for inference only, by training models from scratch directly in the browser, or by using transfer learning to first adapt imported models to the user's context and then use these improved models for inference. 
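The third mode, transfer learning, can be sketched with a frozen "pretrained" feature extractor and a small re-trained head. This is a toy Python illustration under invented data, not Tensorflow.js code; the quadratic feature map stands in for a frozen network.

```python
def features(x):
    # Stand-in for a frozen, pretrained network: a fixed transformation
    # that is reused as-is on the new task.
    return [x, x * x]

def train_head(samples, lr=0.01, epochs=500):
    """Fit only a small linear output layer on top of the frozen features."""
    w = [0.0, 0.0]
    for _ in range(epochs):
        for x, y in samples:
            f = features(x)
            pred = w[0] * f[0] + w[1] * f[1]
            err = pred - y
            # Gradient step on the head only; the features stay frozen.
            w = [wi - lr * err * fi for wi, fi in zip(w, f)]
    return w

def predict(w, x):
    f = features(x)
    return w[0] * f[0] + w[1] * f[1]

# New task in the user's context: y = x^2 (head should learn w ≈ [0, 1])
data = [(1.0, 1.0), (2.0, 4.0), (-1.0, 1.0), (3.0, 9.0)]
w = train_head(data)
```

Because only the tiny head is trained, adaptation is fast enough to run client-side, which is what makes in-browser transfer learning practical.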

As Nikhil Thorat and Daniel Smilkov, members of the Tensorflow team, point out in their announcement video (embedded at the top of this post), running Tensorflow in the browser has several advantages: the infrastructure and set of requirements are simplified, as the need for background API requests is removed; the available data is richer in nature thanks to newly accessible sensors, such as the webcam and microphone on computers or GPS and gyroscope on mobile devices; and the data also remains on the client side, which addresses privacy concerns. 
Read more...

Source: InfoQ.com and TensorFlow Channel (YouTube)



What Is Deep Learning and How Does it Relate to AI? | CMSWire

Photo: Erika Morphy
"Google’s AlphaGo made history in May 2017 when it defeated Ke Jie, the world’s reigning champion of the ancient Chinese game Go" summarizes Erika Morphy, New Orleans-based journalist.

Photo: Vlad Tchompalov

It was the first computer program to defeat a professional human Go player, much less a world champion. Later that year, Google introduced AlphaGo Zero, an even more powerful iteration of AlphaGo.

Anyone wanting to understand the difference between artificial intelligence and deep learning can start by understanding the difference between AlphaGo and AlphaGo Zero. With AlphaGo, Google trained the original AlphaGo to play by teaching it to look at data from the top players, said Avi Reichental, CEO of XponentialWorks. Within a short period of time it was able to beat almost all standing champions hands down, he said. But with AlphaGo Zero, instead of having an algorithm look at lots of data from other players, Google taught the system the rules of the game and let the algorithm learn how to improve on its own, Reichental said. The end result, he said, is a computational power unparalleled in speed and intelligence.

Without a doubt artificial intelligence is becoming more common in our daily and business lives. It is making appearances in voice assistants and chatbots, as well as in complex business applications. As it does, it is important to learn to distinguish among the different types of AI, such as deep learning.

Defining AI and Its Many Iterations 
Starting with the basics, AI is a concept of getting a computer or machine or robot to do what previously only humans could do, said Mark Stadtmueller, VP of Product Strategy at Lucd. Machine learning is a type of AI where algorithms are used to analyze data, he continued. “Machine learning analysis involves looking for patterns within the data and creating and refining a model/equation that best approximates the data pattern. With this model/equation, predictions can be made on new data that follows that data pattern.”
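Stadtmueller's "creating and refining a model/equation that best approximates the data pattern" is exactly what a least-squares fit does. A minimal sketch, with invented data following the pattern y = 2x + 1:

```python
def fit_line(xs, ys):
    """Fit y = a*x + b by ordinary least squares."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    # Slope: covariance of x and y over variance of x
    a = (sum((x - mx) * (y - my) for x, y in zip(xs, ys))
         / sum((x - mx) ** 2 for x in xs))
    b = my - a * mx
    return a, b

# Data following the pattern y = 2x + 1
xs = [1.0, 2.0, 3.0, 4.0]
ys = [3.0, 5.0, 7.0, 9.0]
a, b = fit_line(xs, ys)

# With the refined model/equation, predictions can be made on new data
prediction = a * 5.0 + b
```

Machine learning generalizes this idea to far richer model families, but the loop is the same: propose a model, measure its error against the data, refine, predict.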

Neural networks are a type of machine learning in which brain neuron behavior is approximated to model many input values to determine or predict an outcome, Stadtmueller said. When many layers of neurons are used, it is called a deep neural network. “Deep neural networks have been very successful in improving the accuracy of speech recognition, computer vision, natural language processing and other predictive capabilities,” he said. When using deep neural networks, people refer to it as deep learning, Stadtmueller said. “So deep learning is the act of using a deep neural network to perform machine learning, which is a type of AI.”
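The "many layers of neurons" structure can be made concrete with a tiny forward pass. This is a minimal sketch; the weights are arbitrary illustrative values, not a trained model.

```python
import math

def neuron(inputs, weights, bias):
    # One neuron: weighted sum of many inputs, then a nonlinearity.
    z = sum(i * w for i, w in zip(inputs, weights)) + bias
    return 1.0 / (1.0 + math.exp(-z))  # sigmoid activation

def layer(inputs, weight_rows, biases):
    # A layer is just several neurons reading the same inputs.
    return [neuron(inputs, w, b) for w, b in zip(weight_rows, biases)]

def deep_forward(x, network):
    # A "deep" network applies several layers in sequence, so later
    # layers build on the features computed by earlier ones.
    for weight_rows, biases in network:
        x = layer(x, weight_rows, biases)
    return x

network = [
    ([[0.5, -0.4], [0.3, 0.8]], [0.0, -0.1]),  # hidden layer
    ([[1.0, -1.0]], [0.2]),                    # output layer
]
output = deep_forward([1.0, 2.0], network)  # a single value in (0, 1)
```

Training (deep learning proper) consists of adjusting these weights and biases to reduce prediction error, which is the "refining" step from the machine-learning definition above.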
Read more...

Source: CMSWire



Thursday, April 19, 2018

Do You Really Need An MBA? | Career Advice - Refinery29

Photo: Judith Ohikuare
"Mark Zuckerberg being hauled before Congress is a signal to some people that at 33 years old, the tech founder is finally 'growing up.' So maybe it's also time to retire the idea that successful entrepreneurs should drop out of college to get ahead. After all, companies like Facebook are called 'unicorns' for a reason," says Judith Ohikuare, Work & Money Writer at Refinery29.

Photo: Christy Kurtz
Those of us who didn’t start a multimillion-dollar company in our dorm rooms have to consider other paths to becoming business leaders. It's wise to think twice before spending tens, if not hundreds, of thousands of dollars on a master's in business, but it's also important to remember that an MBA isn't meant to be a prerequisite for starting a business at all. The 2018 Alumni Perspectives Survey from the Graduate Management Admission Council (GMAC) found that 79% of b-school alumni worked for another company and 10% were self-employed. So this degree usually comes in handy for people who dream of being high-level business executives in a variety of industries.

If you're toying with the idea of getting an MBA but aren't sure if the time and financial commitment are worth it, here are some things to consider.

What Is It For?
Writing in Harvard Business Review a few years ago, executive coach Ed Batista said the three key uses for an MBA were: practical leadership and management skills, a job marketplace credential, and access to a vast alumni network. Those may sound like abstract rewards, but in some circumstances they can reap concrete benefits: MBA alums can generally expect higher salaries than other grads, with a median base salary of $115,000, depending on job level and location.

How Much Will It Cost? 
There's no point being delicate about it: MBAs are expensive.

To attend a program at a school like Stanford, you can expect to exceed the $100,000 mark over your two years there. Financial aid, in the form of loans and fellowships, is available, but how much you get of either naturally depends on your individual assets. "On average, people receive $36,000 or $37,000 in fellowships per year. But that average is pretty meaningless because some people get full rides, and others get zero," Khan says. "If you work in a pretty low-income job, you're going to get a much higher financial aid package in terms of the fellowship proportion to loans."...

Is There Another Way In?
The gospel of education can sometimes make it seem like getting any degree, and as many as possible, is necessary to advance. But there's no point in wasting your time and money on a costly credential that yields little benefit. Khan says online MBA programs, an increasingly popular option, may be worthwhile for people who want to gain exposure to specific subject matter — like accounting, for example. (Just do your homework on online programs especially.) On the other hand, if you simply want to learn a skill, you can also enroll in an accounting class without going the full-on school route.
Read more...

Source: Refinery29

