
Friday, April 20, 2018

Machine-learning system processes sounds like humans do | MIT News

"Neuroscientists train a deep neural network to analyze speech and music" says Anne Trafton, MIT News Office.

MIT neuroscientists have developed a machine-learning system that can process speech and music the same way that humans do.
Photo: Chelsea Turner/MIT
Using a machine-learning system known as a deep neural network, MIT researchers have created the first model that can replicate human performance on auditory tasks such as identifying a musical genre.

This model, which consists of many layers of information-processing units that can be trained on huge volumes of data to perform specific tasks, was used by the researchers to shed light on how the human brain may be performing the same tasks.

“What these models give us, for the first time, is machine systems that can perform sensory tasks that matter to humans and that do so at human levels,” says Josh McDermott, the Frederick A. and Carole J. Middleton Assistant Professor of Neuroscience in the Department of Brain and Cognitive Sciences at MIT and the senior author of the study. “Historically, this type of sensory processing has been difficult to understand, in part because we haven’t really had a very clear theoretical foundation and a good way to develop models of what might be going on.”

The study, which appears in the April 19 issue of Neuron, also offers evidence that the human auditory cortex is arranged in a hierarchical organization, much like the visual cortex. In this type of arrangement, sensory information passes through successive stages of processing, with basic information processed earlier and more advanced features such as word meaning extracted in later stages.
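The hierarchical arrangement described above, with basic information processed early and more abstract features extracted later, can be caricatured as a pipeline of processing stages. The sketch below is purely illustrative and is not the researchers' actual network; the functions, signal, and threshold are invented:

```python
# Toy sketch of hierarchical sensory processing: each stage consumes the
# previous stage's output, moving from raw signal toward an abstract label.

def extract_spectrum(waveform):
    # Early stage: crude energy summary of the raw signal.
    return [abs(x) for x in waveform]

def pool_features(spectrum, width=2):
    # Middle stage: pool neighboring values into coarser features.
    return [max(spectrum[i:i + width]) for i in range(0, len(spectrum), width)]

def classify(features, threshold=0.5):
    # Late stage: an abstract decision built on the pooled features.
    return "speech-like" if sum(features) / len(features) > threshold else "quiet"

signal = [0.1, -0.9, 0.8, -0.2, 0.7, -0.6]
label = classify(pool_features(extract_spectrum(signal)))
```

The point is only the shape of the computation: later stages never see the raw waveform, just the summaries produced by earlier stages, which is the sense in which processing is "hierarchical."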

MIT graduate student Alexander Kell and Stanford University Assistant Professor Daniel Yamins are the paper’s lead authors. Other authors are former MIT visiting student Erica Shook and former MIT postdoc Sam Norman-Haignere. 

Modeling the brain
When deep neural networks were first developed in the 1980s, neuroscientists hoped that such systems could be used to model the human brain. However, computers from that era were not powerful enough to build models large enough to perform real-world tasks such as object recognition or speech recognition.

Over the past five years, advances in computing power and neural network technology have made it possible to use neural networks to perform difficult real-world tasks, and they have become the standard approach in many engineering applications. In parallel, some neuroscientists have revisited the possibility that these systems might be used to model the human brain.
Read more... 

Journal Reference:
Alexander J.E. Kell, Daniel L.K. Yamins, Erica N. Shook, Sam V. Norman-Haignere, Josh H. McDermott. A Task-Optimized Neural Network Replicates Human Auditory Behavior, Predicts Brain Responses, and Reveals a Cortical Processing Hierarchy. Neuron, 2018; DOI: 10.1016/j.neuron.2018.03.044

Source: MIT News

If you enjoyed this post, make sure you subscribe to my Email Updates!

Tensorflow with Javascript Brings Deep Learning to the Browser

Alexis Perrier, Data Scientist, informs: "At the recent TensorFlow Dev Summit 2018, Google announced the release of Tensorflow.js, a Javascript implementation of Tensorflow, its open-source deep-learning framework. Tensorflow.js allows training models directly in the browser by leveraging the WebGL JavaScript API for faster computations."

Machine Learning in JavaScript (TensorFlow Dev Summit 2018)

Tensorflow.js is an evolution of deeplearn.js, a Javascript library released by Google in August 2017. Deeplearn.js was born out of the success of the Tensorflow Playground, an interactive visualization of neural networks written in TypeScript.

Tensorflow.js has four layers: the WebGL API for GPU-supported numerical operations, the web browser for user interactions, and two APIs, Core and Layers. The low-level Core API corresponds to the former deeplearn.js library. It provides hardware-accelerated linear algebra operations and an eager API for automatic differentiation. The higher-level Layers API is used to build machine-learning models on top of Core; it is modeled after Keras and implements similar functionality. It also allows models previously trained in Python with Keras or as TensorFlow SavedModels to be imported and used for inference or transfer learning in the browser.

With Tensorflow.js, machine-learning models can be utilized in the browser in three ways: by importing already pre-trained models and using them for inference only, by training models from scratch directly in the browser, or by using transfer learning to first adapt imported models to the user's context and then use these improved models for inference. 

As Nikhil Thorat and Daniel Smilkov, members of the Tensorflow team, point out in their announcement video (see the top of the post), running Tensorflow in the browser has several advantages: the infrastructure and set of requirements are simplified, as the need for background API requests is removed; the available data is richer in nature thanks to newly accessible sensors, such as the webcam and microphone on computers or the GPS and gyroscope on mobile devices; and the data remains on the client side, which addresses privacy concerns.

Source: TensorFlow Channel (YouTube)


What Is Deep Learning and How Does it Relate to AI? | CMSWire

Photo: Erika Morphy
"Google’s AlphaGo made history in May 2017 when it defeated Ke Jie, the world’s reigning champion of the ancient Chinese game Go" summarizes Erika Morphy, New Orleans-based journalist.

Photo: Vlad Tchompalov

It was the first computer program to defeat a professional human Go player, much less a world champion. Later that year, Google introduced AlphaGo Zero, an even more powerful iteration of AlphaGo.

Anyone wanting to understand the difference between artificial intelligence and deep learning can start by understanding the difference between AlphaGo and AlphaGo Zero. With AlphaGo, Google trained the original AlphaGo to play by teaching it to look at data from the top players, said Avi Reichental, CEO of XponentialWorks. Within a short period of time it was able to beat almost all standing champions hands down, he said. But with AlphaGo Zero, instead of having an algorithm look at lots of data from other players, Google taught the system the rules of the game and let the algorithm learn how to improve on its own, Reichental said. The end result, he said, is a computational power unparalleled in speed and intelligence.

Without a doubt artificial intelligence is becoming more common in our daily and business lives. It is making appearances in voice assistants and chatbots, as well as in complex business applications. As it does, it is important to learn to distinguish among the different types of AI, such as deep learning.

Defining AI and Its Many Iterations 
Starting with the basics, AI is a concept of getting a computer or machine or robot to do what previously only humans could do, said Mark Stadtmueller, VP of Product Strategy at Lucd. Machine learning is a type of AI where algorithms are used to analyze data, he continued. “Machine learning analysis involves looking for patterns within the data and creating and refining a model/equation that best approximates the data pattern. With this model/equation, predictions can be made on new data that follows that data pattern.”
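Stadtmueller's description of machine learning, fitting a model/equation to a data pattern and then predicting on new data that follows it, can be illustrated with the simplest possible case: a least-squares line fit in plain Python. The data points here are made up for illustration:

```python
# Hypothetical data: hours studied (xs) vs. test score (ys).
xs = [1.0, 2.0, 3.0, 4.0, 5.0]
ys = [2.1, 4.0, 6.2, 7.9, 10.1]

n = len(xs)
mean_x = sum(xs) / n
mean_y = sum(ys) / n

# Least-squares slope and intercept: the "model/equation" that best
# approximates the data pattern.
slope = sum((x - mean_x) * (y - mean_y) for x, y in zip(xs, ys)) / \
        sum((x - mean_x) ** 2 for x in xs)
intercept = mean_y - slope * mean_x

# Prediction on new data assumed to follow the same pattern.
predicted = slope * 6.0 + intercept
```

Everything more sophisticated in machine learning, up to and including deep neural networks, is a variation on this loop: choose a model family, fit its parameters to data, predict on new inputs.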

Neural networks are a type of machine learning in which brain neuron behavior is approximated to model many input values to determine or predict an outcome, Stadtmueller said. When many layers of neurons are used, it is called a deep neural network. “Deep neural networks have been very successful in improving the accuracy of speech recognition, computer vision, natural language processing and other predictive capabilities,” he said. When using deep neural networks, people refer to it as deep learning, Stadtmueller said. “So deep learning is the act of using a deep neural network to perform machine learning, which is a type of AI.”
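A minimal sketch of Stadtmueller's definitions: an artificial "neuron" computes a weighted sum of its inputs passed through a nonlinearity, a layer is many neurons reading the same inputs, and stacking layers makes the network "deep." The weights below are arbitrary, untrained values chosen purely for illustration:

```python
import math

def neuron(inputs, weights, bias):
    # One artificial neuron: weighted sum of inputs through a nonlinearity.
    return math.tanh(sum(w * x for w, x in zip(weights, inputs)) + bias)

def layer(inputs, weight_rows, biases):
    # A layer is just many neurons reading the same inputs.
    return [neuron(inputs, w, b) for w, b in zip(weight_rows, biases)]

# Two hidden layers make this (minimally) "deep": the outputs of one layer
# become the inputs of the next.
x = [0.5, -0.2]
h1 = layer(x, [[0.4, 0.9], [-0.7, 0.3]], [0.1, 0.0])
h2 = layer(h1, [[1.2, -0.5], [0.6, 0.8]], [0.0, -0.1])
prediction = neuron(h2, [0.9, -0.4], 0.2)
```

Training, which this sketch omits entirely, is the process of adjusting those weights and biases so the final output matches known examples.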

Source: CMSWire


Thursday, April 19, 2018

Do You Really Need An MBA? | Career Advice - Refinery29

Photo: Judith Ohikuare
"Mark Zuckerberg being hauled before Congress is a signal to some people that at 33 years old, the tech founder is finally 'growing up.' So, maybe it's also time to retire the idea that successful entrepreneurs should drop out of college to get ahead. (After all, companies like Facebook are called 'unicorns' for a reason.)" says Judith Ohikuare, Work & Money Writer at Refinery29.

Photo: Christy Kurtz
Those of us who didn’t start a multimillion-dollar company in our dorm rooms have to consider other paths to becoming business leaders. It's wise to think twice before spending tens, if not hundreds, of thousands of dollars on a master's in business, but it's also important to remember that an MBA isn't meant to be a prerequisite for starting a business at all. The 2018 Alumni Perspectives Survey from the Graduate Management Admission Council (GMAC) found that 79% of b-school alumni worked for another company and 10% were self-employed. So the degree usually comes in handy for people who dream of becoming high-level business executives in a variety of industries.

If you're toying with the idea of getting an MBA but aren't sure if the time and financial commitment are worth it, here are some things to consider.

What Is It For?
Writing in Harvard Business Review a few years ago, executive coach Ed Batista said the three key uses for an MBA were: practical leadership and management skills, a job-marketplace credential, and access to a vast alumni network. Those may sound like abstract rewards, but in some circumstances they can reap concrete benefits: MBA alums can generally expect higher salaries than other grads, with a median base salary of $115,000, depending on job level and location.

How Much Will It Cost? 
There's no point being delicate about it: MBAs are expensive.

To attend a program at a school like Stanford, you can expect to exceed the $100,000 mark over your two years there. Financial aid, in the form of loans and fellowships, is available, but how much you get of either naturally depends on your individual assets. "On average, people receive $36,000 or $37,000 in fellowships per year. But that average is pretty meaningless because some people get full rides, and others get zero," Khan says. "If you work in a pretty low-income job, you're going to get a much higher financial aid package in terms of the fellowship proportion to loans."...

Is There Another Way In?
The gospel of education can sometimes make it seem like getting any degree, and as many as possible, is necessary to advance. But there's no point in wasting your time and money on a costly credential that yields little benefit. Khan says online MBA programs, an increasingly popular option, may be worthwhile for people who want to gain exposure to specific subject matter — like accounting, for example. (Just do your homework on online programs especially.) On the other hand, if you simply want to learn a skill, you can also enroll in an accounting class without going the full-on school route.

Source: Refinery29


Lack of security skills has become a drag on Australia’s digital transformation | CSO Australia

Photo: David Braue
"A lack of cybersecurity skills has forced more than half of Australian IT decision-makers to slow down their cloud rollouts, according to new research that has redoubled the urgency of strategies for building and deploying Australia’s cybersecurity capabilities," writes David Braue (CSO Online).
Photo: CSO Australia

As new initiatives woo cybersecurity talent, Australia’s cybersecurity workforce is falling behind global benchmarks – and cloud-first initiatives are suffering

A lack of cybersecurity skills has forced more than half of Australian IT decision-makers to slow down their cloud rollouts, according to new research that has redoubled the urgency of strategies for building and deploying Australia’s cybersecurity capabilities.

The rush to the cloud was slowing across the board, according to a new McAfee survey of 1,400 IT decision-makers that found the proportion of businesses with cloud-first strategies had dropped from 82 percent a year ago to 65 percent now.

One in four companies has experienced data theft from the public cloud, while one in five said they have experienced an advanced attack against their public cloud infrastructure.

With cloud security estimated to rise from 27 percent of IT-security budgets to 37 percent within the next 12 months, Cloud Security Business Unit senior vice president Rajiv Gupta told CSO Australia, the figures suggest that customers were learning the hard way that cloud security is harder than many companies had anticipated when they began ambitious digital-transformation efforts.

Poor visibility was flagged as a significant issue – and vendors, Gupta said, are to blame. 

“We see a plethora of vendors claiming to be best of breed, but they have laid the effort of integrating all of these products into a cohesive whole, on the feet of their customers.”

“But that is not their business; their business is producing sweaters, or cars, or managing financial instruments. We as an industry need to show that the different products we sell can exchange threat telemetry to function as a cohesive whole.”

Significantly, the problem seemed to be markedly worse in Australia, where 53 percent of respondents said problems with cloud security had forced them to slow down their cloud rollouts. This was well above the 30 percent figure in the UK, 37 percent in Canada, and 40 percent figure recorded globally – suggesting that the long-reported paucity of relevant security skills in Australia was taking its toll.

Just 10 percent of Australian companies said they do not have a cybersecurity skill shortage and are continuing with cloud adoption – well behind the 24 percent figure in the UK, 19 percent in the US and Japan, and 16 percent globally.
Read more... 

Source: CSO Australia


Wednesday, April 18, 2018

This Online MBA Program Is Bringing Distance-Learning Into The Real World | MBA Distance Learning - BusinessBecause

Photo: Amy Hughes
"Discovery learning techniques on Maastricht School of Management’s (MSM) Online MBA bring the advantages of campus-based courses to bear in the distance-learning space" reports Amy Hughes, Business Because Ltd.

Maastricht School of Management activates MBAs' learning through discovery learning techniques
Photo: BusinessBecause 
Why do people opt for full-time MBAs?

For some, it’s the immersive experience of postgraduate education that swings their decision. For others, it’s about networking—getting to know a cohort inside and outside the classroom on a full-time course. For others still, it’s the practical experience that many courses offer: the chance to apply their learning in real-time.

But imagine if you could get all of this from a part-time course—a course that you could complete, if you wanted to, entirely online.
This is what Maastricht School of Management (MSM) aims to deliver through its Online MBA, launched in 2017. The course eliminates the opportunity cost associated with traditional MBAs, as it doesn’t require students to leave work. CEO magazine ranks it in the ‘Gold Tier’ of online MBA programs, 4th in the world, attesting to the quality of this Online MBA program.

The Online MBA at MSM is governed by the principle of ‘Discovery Learning’ which, according to course director Dr. Pascale Hardy, encourages the students to be active in their own education. 

“Unlike a traditional course where students obtain most new knowledge directly from course instructors via lectures and textbook-based discussions and assignments,” she explains, “the courses in the online program require that students learn through discovery—by researching, reading, undertaking online activities such as writing papers, and discussion board posts that demonstrate their knowledge, understanding, and their ability to apply [these things].”

Online MBA students have the advantage of learning while they work, and MSM’s Online MBA program actively encourages MBAs to apply the lessons they learn in their modules to their everyday working environment.

In line with MSM’s increasing commitment to innovation, this active engagement in the collection and collation of course information is supported by the latest developments in technology.

“The Maastricht School of Management Online MBA [uses] cutting-edge technology to support an innovative pedagogical framework in order to offer students the ultimate learning experience,” Pascale confirms. “Students participate in video conferencing sessions [hosted by Zoom Web Communications], during which they review and discuss the topics with their peers and instructor.” 
Read more... 

Source: BusinessBecause


If You Read One Higher Ed Book This Year, Make it 'Robot-Proof' | Inside Higher Ed - Technology and Learning

Follow on Twitter as @joshmkim
"Where we fit, and how we must change, in the age of intelligent robots" argues Dr. Joshua Kim, Director of Digital Learning Initiatives at the Dartmouth Center for the Advancement of Learning (DCAL).


It is not an accident that I’m putting my review of Robot-Proof in "Inside Digital Learning." My goal is to drive maximum awareness of this book among anyone thinking about the future of higher education.

Aoun, a linguist who also happens to head Northeastern University, makes the case that our economy is on the cusp of enormous change. He is largely supportive of the argument that accelerating improvements in the bundle of technologies that comprise artificial intelligence (sensors, processing, big-data analysis) will drive fundamental changes at every level of our economy. This AI-driven shift will be as consequential as the two earlier large-scale economic shifts: from agriculture to manufacturing, and then from manufacturing to services.

Our higher education sector, according to Aoun, is not moving fast enough in the face of this large-scale economic change. Just as our colleges and universities are finally getting aligned with today’s service-based labor market, we are doing too little to prepare our students for an age of smart machines.

What is the recipe for higher education to become robot-proof? The answer will be context-dependent, as every institution must build on its own strengths and traditions. Aoun does not claim to offer any algorithm for organizational change. He is sensitive to the complexity of our institutions, and is not an advocate of simplistic notions around disruption. What Aoun does argue forcefully for is an end to the traditional thinking that places a liberal arts education in opposition to preparation for employment. He finds that the liberal arts / employment-preparation dichotomy is no longer accurate, or particularly useful in evolving our institutions.

We should, argues Aoun, seek to align teaching and learning at our colleges and universities with the research on learning. What this scholarship demonstrates is that there are limits to both abstract and applied learning. The two must be married. Active and experiential learning opportunities are critical components of a valuable postsecondary education.

Aoun believes that in an economy where smart robots do much of the work that people do today -- including the information-based service work (accounting, legal services, etc.) that was previously protected from automation -- the skills prioritized in a liberal arts education will be increasingly in demand. These skills include judgment, collaboration, curiosity, communication, empathy, teamwork, leadership and many others. These creative, social and leadership qualities represent tasks that can’t be automated.
Read more... 

Additional resources 

Robot-Proof: Higher Education in the Age of Artificial Intelligence, by Joseph E. Aoun (MIT Press, 2017)

Source: Inside Higher Ed (blog)


Time to teach insurance to computers | Digital Insurance

"Due to retirement and outsourcing, insurers are facing a steep drop in institutional knowledge about the industry, their products and processes," writes Digital Insurance.

Photo: Digital Insurance

I’ve been noodling on an idea ever since I started working with machine learning (ML) and artificial intelligence (AI) software suites. Due to retirement and outsourcing, insurers are facing a steep drop in institutional knowledge about the industry, their products, and their processes.

This knowledge was acquired over 20-30 years through a close working relationship between the business, IT, and the core systems in use. In many cases, these details are in the heads of a select few service reps, business analysts, and programmers. The documentation may exist, but it requires a base understanding of insurance and corporate history to understand it.

When I think about the processes that led to the possession of seemingly intuitive knowledge, I conclude that it all started with a strong foundation. Many companies encouraged their employees to pursue industry designations and certifications, both for the business and technical staff. These designations are offered by LOMA, LIMRA, FINRA, and the American College. Acquiring these credentials equips the individual with a breadth and depth of knowledge about insurance products, processes, and sales. This is the building block upon which the company-specific knowledge is layered.
Read more... 

Source: Digital Insurance


What is Actuarial Science? How Is It Different From Data Science? | Analytics India Magazine

Photo: Srishti Deoras
"The roots of actuarial science date back to the time when the concept of compound interest was published by Richard Witt, and it can be considered one of the oldest professions," summarizes Srishti Deoras, who works as Sr. Content Strategist for Analytics India Magazine.

Photo: Analytics India Magazine

Employed early on in underwriting loans and in insurance, actuaries have been among the earliest users of data and of the insights generated from it. Though the profession has existed for a very long time, demand has risen again, with reports suggesting that India now has only 4% of the actuaries needed for the roles that are available.

They have long been perceived as relevant only to the insurance industry (LIC, AIG, etc.) and banking. However, their work extends to diverse industries such as banking, healthcare, pensions and benefits, asset management, capital projects, investments, and risk. In these industries they can take on the roles of consultants, analysts, troubleshooters, and risk assessors, among others.

To put it simply, an actuary specialises in evaluating the financial implications of risk and uncertainty, while devising solutions to reduce the chances of future risks and the occurrence of undesirable events. Though it is a narrow field with a specific job destination, it is one of the best-paid professions. Numbers suggest that most actuaries who qualify either leave India for better opportunities or start their own businesses. Given the sparse number of these professionals in India, demand for them is on the rise.
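The kind of evaluation described here can be sketched in miniature: price a policy from the expected value of a loss, plus a loading for expenses and uncertainty. The probability, loss amount, and loading below are invented purely for illustration:

```python
# Hypothetical policy: a 2% chance of a 500,000 loss in a year.
p_loss = 0.02
loss_amount = 500_000

# Expected (pure) cost of carrying the risk.
expected_loss = p_loss * loss_amount

# Actuaries add a loading for expenses and uncertainty; 25% is illustrative.
premium = expected_loss * 1.25
```

Real actuarial pricing layers far more on top of this (claim-frequency models, reserves, regulation), but expected loss plus loading is the core idea.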

What are the skills required to be an actuary, and how can you acquire them?
Being an actuary typically requires a minimum of a bachelor’s degree. The direct route is to major in actuarial science, with coursework in math and statistics plus industry-related experience; however, other quantitative courses can also produce candidates for actuarial science. These majors typically include computer science, economics, mathematics, physics, statistics, the humanities, and English, among others...

In a nutshell, some of the skills that actuaries are expected to have are: 
  • Analytical problem-solving skills - Since they are tasked with examining complex data and identifying trends, analytical problem solving remains a key skill that helps actuaries look for ways to minimize the likelihood of undesirable outcomes.
  • Computer skills - A computer and a variety of statistical modelling software are required to evaluate large volumes of data. Courses in SAS or VBA may be required, along with fluency in spreadsheets, database manipulation, statistical analysis programs, and programming languages.
  • Math skills - Since actuaries deal with numbers constantly, speed and accuracy in math are essential. Knowledge of calculus, probability, statistics and other areas will help.
  • Understanding of business and finance - As actuaries are typically employed by insurance and financial institutions, a fair understanding of how these industries work is an advantage. Good business sense helps in devising solutions for financial risk and providing expert opinion.
  • Communication skills - As mentioned earlier, since actuaries connect with various personnel, including programmers, accountants, and senior management, being able to communicate effectively is key.
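The "statistical analysis" in the skills above can start very simply: summary statistics over claims data. Here is a minimal pure-Python illustration; the claim amounts are made up:

```python
# Illustrative claim amounts (hypothetical figures).
claims = [1200.0, 950.0, 3100.0, 780.0, 1500.0]

mean = sum(claims) / len(claims)

# Population variance: average squared deviation from the mean;
# its square root is the standard deviation, a basic spread measure.
variance = sum((c - mean) ** 2 for c in claims) / len(claims)
std_dev = variance ** 0.5
```

In practice an actuary would reach for dedicated statistical software for this, as the article notes, but the underlying quantities are the same.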

Some of the colleges in India that provide exclusive courses in actuarial science are Aligarh Muslim University, Andhra University, University of Mumbai, University of Delhi, University of Madras, and AMITY, amongst others. Several private institutes, such as IIRM Hyderabad and Chitkara University, are also booming in the space, providing bachelor's and master's degrees in actuarial science.
Read more... 

Source: Analytics India Magazine


Tuesday, April 17, 2018

Why American Students Haven't Gotten Better at Reading in 20 Years | The Atlantic

Photo: Natalie Wexler
"Schools usually focus on teaching comprehension skills instead of general knowledge—even though education researchers know better" according to Natalie Wexler, journalist based in Washington, D.C. Natalie is the co-author of The Writing Revolution.

Photo: Geri Lavrov / Getty

Every two years, education-policy wonks gear up for what has become a time-honored ritual: the release of the Nation’s Report Card. Officially known as the National Assessment of Educational Progress, or NAEP, the data reflect the results of reading and math tests administered to a sample of students across the country. Experts generally consider the tests rigorous and highly reliable—and the scores basically stagnant.

Math scores have been flat since 2009 and reading scores since 1998, with just a third or so of students performing at a level the NAEP defines as “proficient.” Performance gaps between lower-income students and their more affluent peers, among other demographic discrepancies, have remained stubbornly wide.

Among the likely culprits for the stalled progress in math scores: a misalignment between what the NAEP tests and what state standards require teachers to cover at specific grade levels. But what’s the reason for the utter lack of progress in reading scores?

On Tuesday, a panel of experts in Washington, D.C., convened by the federally appointed officials who oversee the NAEP concluded that the root of the problem is the way schools teach reading. The current instructional approach, they agreed, is based on assumptions about how children learn that have been disproven by research over the last several decades—research that the education world has largely failed to heed.

The long-standing view has been that the first several years of elementary school should be devoted to basic reading skills. History, science, and the arts can wait. After all, the argument goes, if kids haven’t learned to read—a task that is theoretically accomplished by third grade—how will they be able to gain knowledge about those subjects through their own reading?

The federal No Child Left Behind legislation, enacted in 2001, only intensified the focus on reading. The statute required states to administer annual reading and math tests to students in grades three through eight and once in high school, and attached hefty consequences if schools failed to boost scores. The law that replaced No Child Left Behind—the Every Student Succeeds Act, enacted in 2015—has eased the consequences but has hardly weakened the emphasis on testing.

What is tested, some educators say, gets taught—and what isn’t doesn’t. Since 2001, the curriculum in many elementary schools has narrowed to little more than a steady diet of reading and math. And when test scores fail to rise after third grade—as they often do, especially in high-poverty schools—subjects like history and science may continue to be relegated to the far back burner through middle school.

To some extent, it does make sense to focus on reading skills in the early years. One component of reading is, like math, primarily a set of skills: the part that involves decoding, or making connections between sounds and the letters that represent them...

...Louisiana has not only created its own curriculum, it has also asked the federal government for permission to give tests based on that curriculum rather than on passages about a range of randomly selected topics. If this movement spreads, the National Assessment of Educational Progress may finally live up to its name, and the American education system may at last be able to unlock opportunity for millions of students.
Read more... 

Recommended Reading: 

Photo: Steve Helber / AP
The Future of College Looks Like the Future of Retail writes Jeffrey Selingo, contributing editor to The Atlantic and the author of There Is Life After College.
"Similar to e-commerce firms, online-degree programs are beginning to incorporate elements of an older-school, brick-and-mortar model."  

Source: The Atlantic
