
Sunday, October 25, 2020

New teaching model grooms students for the data age | Singapore - University World News

Kalinga Seneviratne reports for University World News that Singapore’s Nanyang Polytechnic (NYP), partnering with industry players, is launching a bold and unprecedented approach to course design and delivery by introducing a new Professional Competency Model (PCM) for students entering its diploma in business intelligence and analytics (DBIA) next year.

Photo: Nanyang Polytechnic (NYP) 
It will move away from the usual subject-based approach to one that is structured around workplace competencies...

“The NYP PCM transforms the fundamentals of teaching and learning in polytechnic education. The key focus is teaching competencies pegged to a specific work function and taught collectively in a competency unit (CU), unlike the traditional curriculum model, which breaks down competencies into different subjects for teaching,” explained NYP Principal and CEO Jeanne Liew in an interview with University World News.

When asked about whether NYP is drifting away from the traditional lecture-based teaching model, Liew argued that learning in a real-world context allows the experience to be more meaningful and effective.

“Traditionally, you’d break down the integrated blend of academic content and skills into subjects: statistics and maths, for instance, to deal with data,” she explained.

 

Saturday, October 24, 2020

A math idea that may dramatically reduce the dataset size needed to train AI systems | Machine learning & AI - Tech Xplore

A pair of statisticians at the University of Waterloo has proposed a mathematical approach that might allow AI systems to be trained without the need for a large dataset, Tech Xplore reports.

Photo: CC0 Public Domain

Ilia Sucholutsky and Matthias Schonlau have written a paper describing their idea and published it on the arXiv preprint server.

Artificial intelligence (AI) applications have been the subject of much research lately. With the development of deep learning networks, researchers in a wide range of fields began finding uses for the technology, including creating deepfake videos, board game applications and medical diagnostics.

Deep learning networks require large datasets in order to detect patterns revealing how to perform a given task, such as picking a certain face out of a crowd. In this new effort, the researchers wondered if there might be a way to reduce the size of the dataset. They noted that children only need to see a couple of pictures of an animal to recognize other examples. Being statisticians, they wondered if there might be a way to use mathematics to solve the problem.

The researchers built on recent work by a team at MIT.
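The trick behind "less than one"-shot learning is to attach soft labels (probability distributions over classes) to a handful of carefully chosen examples, so that fewer examples than classes can still define a classifier. A minimal sketch of that idea follows; it is illustrative only, not the authors' code, and the prototype positions and soft labels are invented for the example:

```python
# Illustrative "less than one"-shot classifier: 2 prototypes encode 3 classes.
# Each prototype carries a soft label (a distribution over classes); a query
# point blends the soft labels of the prototypes by inverse-distance weighting.

prototypes = [
    (0.0, [0.6, 0.0, 0.4]),   # mostly class 0, partly class 2
    (1.0, [0.0, 0.6, 0.4]),   # mostly class 1, partly class 2
]

def classify(x):
    eps = 1e-9  # avoid division by zero when x sits exactly on a prototype
    weights = [1.0 / (abs(x - p) + eps) for p, _ in prototypes]
    total = sum(weights)
    scores = [0.0, 0.0, 0.0]
    for w, (_, soft) in zip(weights, prototypes):
        for c, s in enumerate(soft):
            scores[c] += (w / total) * s
    return max(range(3), key=lambda c: scores[c])
```

A query near either prototype takes that prototype's dominant class, while a query halfway between them is assigned the third class that neither prototype represents outright, which is how two examples can carve out three class regions.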


Additional resources 
'Less Than One'-Shot Learning: Learning N Classes From M < N Samples, arXiv:2009.08449 [cs.LG]

Source: Tech Xplore  

How I passed the TensorFlow Developer Certification Exam | Syndication - TNW

This article was written by Daniel Bourke and was originally published on Towards Data Science.

Daniel Bourke, author at The Next Web, writes: At the start of May, I decided to get TensorFlow Developer Certified. So I set myself up with a curriculum to sharpen my skills and took the certification — turns out, I passed.

Photo: Unsplash

Let me tell you how I did it and how you can too.

Hold on, what even is TensorFlow?

TensorFlow is an open-source numerical computing framework that allows you to preprocess data, model data (find patterns in it, typically with deep learning) and deploy your solutions to the world.

It’s what Google uses to power all of its machine learning services. Chances are, the device you’re reading this on has run some kind of TensorFlow before.
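Under the hood, "finding patterns in data" mostly means fitting model parameters by gradient descent. Here is a tiny dependency-free sketch of that training loop, written in plain Python rather than actual TensorFlow code, fitting a hypothetical 1-D line y = 2x + 1:

```python
# Fit a 1-D linear model y = w*x + b to data by gradient descent -- the same
# core loop TensorFlow automates (with autodiff, GPUs, and real networks).

xs = [0.0, 1.0, 2.0, 3.0, 4.0]
ys = [1.0, 3.0, 5.0, 7.0, 9.0]   # generated by y = 2x + 1

w, b, lr = 0.0, 0.0, 0.02
for _ in range(2000):
    # gradients of mean squared error with respect to w and b
    grad_w = sum(2 * (w * x + b - y) * x for x, y in zip(xs, ys)) / len(xs)
    grad_b = sum(2 * (w * x + b - y) for x, y in zip(xs, ys)) / len(xs)
    w -= lr * grad_w
    b -= lr * grad_b

# After training, w and b converge close to the true values 2 and 1.
```

In real TensorFlow the same job is a few lines of `tf.keras` code, with the gradients computed automatically.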

How to prepare for the exam

When I decided I wanted to take the exam, I went through the certification website and read the TensorFlow Developer Certification Handbook.

From these two resources, I built the following curriculum.

Curriculum — what I studied to build the skills necessary for passing the exam

It should be noted that before I started studying for the exam, I had some hands-on experience building several projects with TensorFlow.

The experienced TensorFlow and deep learning practitioner will likely find they can go through the following curriculum at about the same pace I did (three weeks total), maybe faster.

The beginner will want to take as much time as needed. Remember: building any worthwhile skill takes time. 

Read more... 

Source: TNW 

Stephen Wolfram Battles the Theory of Everything | Physics - Medium

~ a surprisingly coherent path toward understanding the Universe ~

I can feel your reservations regarding Wolfram’s recent claim of a new way to find a fundamental, all-encompassing theory of physics, writes Anthony Repetto, a self-described easily distracted mathematician.

Photo: Science in HD on Unsplash
We hear that a lot. And Wolfram’s track record of verbose grandeur, mentioning the foundational work of others only as an aside, has been a valid lingering criticism. Yet he has been doing a different kind of scientific experiment, using computer simulations of numerous sorts of universes, and new ways of looking are precisely how people discover revolutionary new things. Is that what Wolfram found? Not precisely, but we can cover the gist of his team’s work and see what might grow from his unique approach.

Universal Rules

Physicists keep looking for the most fundamental laws: some small set of rules that combine to generate space, time, energy, mass, gravity, magnetism, etc. No scientist expects that, upon finding those rules, we would then be able to ‘simulate everything’ or ‘predict the future’ or ‘know all truths’. Nope. They’re just looking for the most basic instructions that create a cosmos like ours. Like knowing the general way electrons zip around inside a computer, rather than knowing what the software is about to do. That’s all...
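Wolfram's experiments apply a tiny rewrite rule over and over and watch structure emerge. His actual models rewrite hypergraphs; as a simplified stand-in, the same flavor of experiment can be run on strings. The rule below (A → AB, B → A) is a standard toy substitution system, not one of Wolfram's rules:

```python
# A substitution system: a simple rule, applied repeatedly, produces
# structure far richer than the rule itself suggests.

def step(s):
    # rewrite every A as AB and every B as A, simultaneously
    return "".join("AB" if ch == "A" else "A" for ch in s)

state = "A"
for _ in range(6):
    state = step(state)

# The string lengths grow as Fibonacci numbers: 1, 2, 3, 5, 8, 13, 21...
```

The hunt in Wolfram's project is for a rule whose emergent structure behaves like space, time, and particles rather than like a string of letters.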

Yet, our universe has many particularities — the interactions between particles of each universe-rule are radically distinct from each other. If we find a universe that makes quarks and electrons behave as they do here, then we’re probably done.


Source: Medium 

The two most important barriers to deep learning | Online Learning & Elearning - The Tech Edvocate

Have you ever sat and tried to remember something — tried for all you’re worth and still failed? asks Matthew Lynch, Author at The Edvocate.

The Two Main Barriers Against Deep Learning
Photo: The Tech Edvocate
My sister used to call this a “brain fart.” 

Now, look at it from the other angle. Have you ever tried to learn something—learn with all your might and still come up short? Maybe you were trying to figure out genetics… 
or your mom’s favorite pancake recipe. Perhaps it was Algebra 2… or how your dad always hit the target just right and made it look so easy. Maybe you were battling one of the two main barriers against deep learning.

What Is Deep Learning?
“Deep learning” is actually a machine learning term. It’s an AI (artificial intelligence) function. Artificial intelligence refers to programs that allow machines to do the things that typically take a human to do...

Conclusion
When students learn through deep learning, they remember the material longer and can recall the material more easily. When you work to overcome these two main barriers against deep learning, your students will be able to access deep learning. They will better understand the material, and their grades will rise.

Digital diagnosis: Why teaching computers to read medical records could help against COVID-19 | COVID-19 - World Economic Forum

This article is published in collaboration with The Conversation.
  • Natural language processing (NLP) algorithms could find patterns across many thousands of patients’ records, helping to find effective treatments. 
  • They could also help to predict which patients are more likely to become seriously ill with COVID-19 - and predict upcoming surges of the pandemic. 

Medical records are a rich source of health data. When combined, the information they contain can help researchers better understand diseases and treat them more effectively.
 

Information gained from computer models could prove critical in the fight against coronavirus.
Photo: REUTERS/Yves Herman
This includes COVID-19. But to unlock this rich resource, researchers first need to read it.

We may have moved on from the days of handwritten medical notes, but the information recorded in modern electronic health records can be just as hard to access and interpret. It’s an old joke that doctors’ handwriting is illegible, but it turns out their typing isn’t much better.

Finding ways to fight COVID-19
By drawing together health records using these tools, we’re now using these techniques to see patterns that are relevant to the pandemic. For example, we recently used our tools to discover whether drugs commonly prescribed to treat high blood pressure, diabetes and other conditions – known as angiotensin-converting enzyme inhibitors (ACEIs) and angiotensin receptor blockers (ARBs) – increase the chances of becoming severely ill with COVID-19.
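As a toy illustration of the kind of pattern-matching involved: ACE inhibitor names generically end in "-pril" and ARB names in "-sartan", so even a simple regular expression can flag candidate drug mentions in free text. The records below are invented, and real clinical NLP systems are far more sophisticated than this sketch:

```python
import re

# Flag ACEI/ARB mentions in free-text notes by their generic-name suffixes.
records = [
    "Patient started on lisinopril 10mg for hypertension.",
    "Continue losartan; blood pressure stable.",
    "No antihypertensives prescribed.",
]

# match whole words ending in -pril (ACEIs) or -sartan (ARBs)
pattern = re.compile(r"\b\w+(?:pril|sartan)\b", re.IGNORECASE)
mentions = [pattern.findall(note) for note in records]
```

Running this over the three notes picks out "lisinopril" and "losartan" and leaves the third record empty; scaling that idea to thousands of records is what lets researchers ask population-level questions like the ACEI/ARB one above.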


Can Artificial Intelligence Help Students Work Better Together? According to Research, the Answer is Yes | Artificial Intelligence - WPI News

WPI computer scientist looks at better collaboration between AI and humans in today’s classrooms by Jessica Marie Messier, Public Relations Specialist at Worcester Polytechnic Institute.

Jake Whitehill (center) is studying which AI Partners will help students collaborate better in the classroom.
WPI assistant professor of computer science Jacob Whitehill is collaborating with colleagues at the University of Colorado Boulder to explore how artificially intelligent (AI) teaching agents might help encourage more meaningful collaboration among students in school classrooms.

As part of a five-year, $20 million grant awarded to CU Boulder, Whitehill and colleagues at nine other institutions will study how to build AI agents that interact with students in small-group collaborative problem-solving settings and thereby foster more enriching learning experiences for students.

“There is so much evidence that students learn best when they learn actively and collaborate with each other in real time,” says Whitehill, who specializes in AI and machine learning. “But that’s not always easy for teachers to achieve, whether the collaboration is happening in a physical classroom or during remote learning.”...

“With Zoom, each student and teacher in the classroom is cleanly separated from each other, and all their audiovisual inputs are channeled through a common software interface. This makes it much easier to analyze their speech, gestures, language, and interactions with each other,” Whitehill says. “In contrast, in normal, in-person classrooms, the interactions are much messier,” since students often sit in all kinds of different positions, might be touching their faces, and work in a noisy environment, which makes it more challenging for the Partner to observe and analyze.


Source: WPI News 

Friday, October 23, 2020

International Open Access Week 2020: Opening the book | Books - OUPblog (blog)

Andy Redman writes on the OUPblog: Often when we talk about open access (OA), we talk about research articles in journals, but for over a decade there has been a growing movement in OA monograph publishing.

Photo: Dmitry Zvolskiy from Pexels

To date, Oxford University Press (OUP) has published 115 OA books and that number increases year on year, partly through an increasing range of funder initiatives and partly through opportunities to experiment.

Increasingly, the policy conversation recognises that the drivers for OA are as applicable to books as they are to research articles, and research funders and policy makers are looking for ways to increase the volume of OA book publishing, but how simple is it to apply the accelerator?...

A book is not a journal
It sounds obvious, but there are significant differences in what books are and how they develop from a period of research, with practical consequences for the extent to which processes that have been applied to open access journals can be made to apply to open access books.

There are several reasons for this...

While we don’t have all of the answers and cannot do this alone, we do have some important questions that must be tackled if we are to move the discussion forwards:

  • What is an effective funding model for OA book publishing which takes into account both the time spent by the author in researching and producing the work and the time spent by the publisher in helping to shape and disseminate it?
  • What timeframes are appropriate for dissemination for a long-form piece of research like a monograph which has a long life of citation and discovery?
  • How can we ensure scholars are not closed out of the move towards OA because the current models do not fit their research areas or funding opportunities?
As we reach the end of International Open Access Week for another year, these are the questions that we will be taking forward in our conversations with policy makers and funders alongside our own publishing models as we look to support our authors and readers through the transition to a more open world for research.

Read more...

What is deep learning? | Deep Learning - FierceElectronics

Deep learning is one of the most promising techniques for training machines to "think" like people by Cabe Atwell, Engineer, Machinist, Maker, Writer.

Photo: Danor_a /iStock/Getty Images Plus

Technology continues to advance at incredible rates. From self-driving vehicles to supercomputers and autonomous drones, the kind of world depicted in I, Robot, is becoming our reality – for better or worse. What’s driving this technology? Artificial intelligence (AI) and its many subcategories, including machine and deep learning. 

While AI is an enormous field that encompasses all machines that can execute tasks that otherwise require human intellect, and machine learning is the process by which machines can learn new tasks autonomously, deep learning is a different beast entirely. Deep learning is a subset of both AI and machine learning and attempts to build the first machines capable of human-like thought. It relies on algorithms to process information and ‘learn’ new skills in a way that mimics how the human brain learns. 
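The brain-inspired building block here is the artificial neuron: a weighted sum of inputs plus a bias, nudged toward the right answer each time it errs. A minimal sketch, training one neuron on the logical OR function (a classic toy case, not production deep learning code):

```python
# One artificial neuron learning logical OR with the classic perceptron rule:
# on each mistake, shift the weights and bias toward the correct answer.

data = [((0, 0), 0), ((0, 1), 1), ((1, 0), 1), ((1, 1), 1)]
w = [0.0, 0.0]   # one weight per input
b = 0.0          # bias
lr = 0.1         # learning rate

def predict(x):
    return 1 if (w[0] * x[0] + w[1] * x[1] + b) > 0 else 0

for _ in range(20):                  # a few passes over the data
    for x, target in data:
        error = target - predict(x)  # -1, 0, or +1
        w[0] += lr * error * x[0]
        w[1] += lr * error * x[1]
        b += lr * error

# After training, the neuron classifies all four OR cases correctly.
```

Deep learning stacks thousands of such units into layers and replaces this simple update rule with gradient-based training, but the "learn from errors" loop is the same.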

So, if deep learning is about training machines to process code like the human brain, how does it work?... 

Deep learning is the future of AI
Deep learning is still in early development, but it’s certainly one of the more powerful techniques in use today to make autonomous, self-thinking machines. While today it’s used in applications such as virtual assistants, marketing algorithms, facial recognition software, and chatbots, this same technology will also power self-driving cars, autonomous data-gathering drones, and so much more in the not-so-distant future.

If you’re interested in learning more about deep learning, check out the book Deep Learning (MIT Press), a foundational text in the field. 

Read more...

Recommended Reading 

Deep Learning (Adaptive Computation and Machine Learning series)

Bridging the Skills Gap for AI and Machine Learning | Data science - Integration Developers

Ryohei Fujimaki, CEO & Founder at dotData, Inc., looks at the latest trends in AI/ML automation – and how they will speed adoption across industries.


COVID-19 has impacted businesses across the globe, from closures to supply chain interruptions to resource scarcity. As businesses adjust to the new normal, many are looking to do more with less and find ways to optimize their current business investments.

In this resource-constrained environment, many types of business investments have slowed dramatically. That said, investments in AI and machine learning are accelerating, according to a recent Adweek survey.

The shortage of data scientists — as well as data architects and machine learning engineers skilled in building, testing, and deploying ML models — has created a big challenge for businesses implementing AI and ML initiatives, limiting the scale of data science projects and slowing time to production. The scarcity of data scientists has also created a quandary for organizations: how can they change the way they do data science, empowering the teams they already have?