
Saturday, July 16, 2016

Your midsummer ed-tech check-in | Laura Devaney, Director of News, K-12 and Higher Education


Follow on Twitter as @eSN_Laura
Laura Devaney reports, "In this week's news, we take a look back at some of the most popular summer stories up until now. Stay tuned for what we have planned for the rest of this summer."


Catch up on the most compelling higher-ed news stories you may have missed this week. 

Each Friday, Laura Devaney will be bringing you a recap of some of the most interesting and thought-provoking news developments that occurred over the week. 

I can’t fit all of our news stories here, though, so feel free to visit eCampusNews.com and read up on other news you may have missed.



Photo: eCampus News

Why Teachers College blew up its wireless network
What would you do if you could blow up your school’s entire wireless network and start all over again? That’s exactly what Teachers College, Columbia University has done, as the storied institution seeks to position itself for the 21st century, complete with cutting-edge online instruction and highly automated classrooms.

Read more...
 
30 technologies ushering in the future for education
If the bright lights and casino bling of Las Vegas weren’t enough to dazzle conference-goers from around the world, the awe-inspiring displays of futuristic technology littering the Infocomm 2016 show floor certainly were, such as Epson’s trippy Infinity Room and LG’s mind-bending OLED curved tiling display.

Read more...

Students say campus technology needs major overhaul–but why?
According to a new report, thanks to a lack of digital options and the tedious online protocols that are part of many campus technology initiatives, students say they study less and think less of their university.

Read more...
 

Could these 3 burgeoning nontraditional pathways be a boon for traditional institutions?
Unconventional education organizations say recent high school grads are flocking to nontraditional pathways; could this be good for college and university admissions?

Read more...

Additional resources
Catch up on the most compelling K-12 news stories you may have missed this week. 


Photo: eSchool News

Let’s take a look back at some of the most popular stories from the first half of this summer. What do they have in common? They all help classroom teachers and education leaders leverage education technology in the best ways possible.

Your midsummer ed-tech check-in  | eSchool News 

Source: eSchool News 

The next wave of disruption: Graph-based machine learning | IDG Connect

Kathryn Cave, Editor at IDG Connect notes, "We look at the pros and cons of machine learning and graph technology and how the two are now working together."

Photo: IDG Connect

Machine learning (ML) is getting a lot of attention at the moment. This is partly because a slew of new companies are emerging which are using it in innovative ways. And partly because it can get easily subsumed into the fuss and furore about AI and the rise of evil robot intelligence. Graph technology, on the other hand, is something which takes more of a back seat and yet, in a lot of ways, also sits at the forefront of the big data and analytics movement. 

“We firmly believe that it's at the intersection of machine learning and graph technology where the next evolution lies and where new disruptive companies are emerging,” says Ash Damle, Founder and CEO at Lumiata, which helps healthcare organisations make predictions.

“It's only recently that companies can use graph at true scale and, now, by integrating with ML, we're moving much more into a core understanding of artificial intelligence, deep neural networks and image recognition.”
 
So, in the simplest terms what are these two technologies?
 
At the most basic level, machine learning uses large quantities of data to make predictions about future events, while graph technology is more concerned with the relationships between different data points.

Claus Jepsen, Chief Architect, R&D at Unit4, which provides enterprise applications, summarises:

“Machine Learning is really the umbrella and graph technology is a way of representing data when using machine learning.”

Peter Duffy, CTO of capacity-planning-as-a-service provider Sumerian, adds that this means: “There is huge potential for businesses to take advantage of both.”

David Thompson, Sr. Director of Product Management at LightCyber further clarifies: “Graph technology can be considered a type or technique of machine learning, or, at a minimum, aspects of graph technology have strong application to machine learning.”
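In the simplest terms, the pairing described above (data held as a graph, features derived from relationships, and a model that consumes them) can be sketched in a few lines of Python. The customer data and the "influencer" rule below are purely hypothetical:

```python
from collections import defaultdict

# Toy "graph + ML" sketch: customers as nodes, interactions as edges
# (hypothetical data).
edges = [("ann", "bob"), ("bob", "cat"), ("cat", "ann"),
         ("dan", "eve"), ("eve", "ann")]

# Build an adjacency list: the graph view of the data.
graph = defaultdict(set)
for a, b in edges:
    graph[a].add(b)
    graph[b].add(a)

# Derive a per-node feature from the relationships (here: degree),
# which a downstream machine learning model could consume.
features = {node: len(neighbors) for node, neighbors in graph.items()}

# A trivial "model": flag well-connected nodes as influencers.
influencers = sorted(n for n, deg in features.items() if deg >= 3)
print(features)      # {'ann': 3, 'bob': 2, 'cat': 2, 'dan': 1, 'eve': 2}
print(influencers)   # ['ann']
```

Real graph-plus-ML systems replace the degree count with richer relational features and the threshold rule with a trained model, but the division of labour is the same: the graph represents the data, the learner makes the prediction.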
Read more... 

Source: IDG Connect

How a course about violence changed the way students are taught and assessed | Times Higher Education

Photo: William Watkin
Photo: Will Self
"A literature module developed at Brunel University London has moved away from the traditional essay format and embraced the digital age" according to William Watkin, professor of contemporary literature and philosophy at Brunel University London, and Will Self, professor of contemporary thought at Brunel University London and author of nine novels.

‘I saw a strangely zombie-like response to the gathering impact of bi-directional digital media’
 

When I began teaching at Brunel University London five years ago I’d had little but glancing contact with the academy since I left higher education myself. My experience of reading for a politics, philosophy and economics degree at the University of Oxford in the late 1970s and early 1980s, may, even for the time, have embodied anachronisms − but the theory and practice of arts and humanities pedagogy I found at Brunel in 2011 remained in essence the same.

Photo: iStock/Getty montage
 At the core of it all, it seems to me, lies the text. At Oxford, I often studied in Duke Humfrey’s library, the dark and woody 14th-century cell deep in the dense honeycomb of the Bodleian. Here, surrounded by ancient tomes, I did my best to impress upon my memory the outline of the canon while at the same time shading in a fraction of its content.

The entire system of learning at Oxford, so far as I can recall, consisted of the combination of mnemonics, composition and argumentation. Reading lists were prodigious: often 20 or 30 items − both entire volumes and journal articles – so redundancy was a given: hours needed to be spent in the library to extract the pith from acres of paper. I took two courses (as modules were then called) every term, and the coursework requirement was an essay of 3,000 words per week for each of them; the sheer amount I had to write gave me the core facility needed for an entire adult working life as a professional writer.

The argumentation was, of course, astonishingly thorough when compared with the meagre “contact hours” most contemporary students are mandated: a full hour vis-à-vis, usually one-to-one, reading out your essay and then picking it apart. Lectures were plentiful and accessible, but I confess: with two hours of tutorials a week, and a minimum of 12 to write my papers, I needed all the remaining ones simply in order to read, if I were to be able to absorb sufficient information to substantiate the sort of large-scale theoretical paradigms I was being introduced to.

As I say, when I arrived at Brunel I found the lineaments of this system still present: reading lists and essay assignments; lectures, seminars and tutorials. I also realised immediately the deep commitment many of my colleagues had to serious, effective pedagogy, and their preparedness to do right by students facing massively increased pressures owing to the marketisation of the sector. However, what I also saw (and I believe this, in part, to be one of the many unforeseen consequences of the 2010 Browne “reforms”, which ushered in the tripling of tuition fees) was a strangely zombie-like response to the gathering impact of bi-directional digital media (BDDM) on the study of arts and the humanities. This is at once a vast subject − and just one aspect of the technological revolution we’re living through; one of such scale, rapidity and obvious transformative potential, it deserves to − and does − generate ever more baroque and reflexive forms of appraisal and criticality. That being noted, there are simple things to be said, and for me they coalesce around a single conceptual object: the skeuomorph.

‘My students chose to look away not out of fear or moral revulsion but out of ennui’
 

This was the first year of my new, final-year literature course at Brunel, entitled Violence. It is, I believe, a unique course in the UK, allowing English literature students the opportunity to engage with the art, theory, politics and technology of violence.

Photo: Getty/Alamy/iStock montage
The course is an innovation in how to study literary production in that it is not determined by a particular period, “ism” or theoretical approach. It is not even dominated by the need to study works of literature, taking, as it does, textual study in its absolutely widest possible sense. Novels are texts; paintings are text; philosophy is text; film is text. But YouTube is also text; Twitter is text; wolf whistles can be text. In the final week of the course, the reading set for the class said simply: “the internet”.

The idea behind the course was to let urgent issues dictate the nature of its design, rather than apply already tested frameworks of study to, say, representations of violence online. I envisage that this may be one way forward for literary studies and the humanities in general: let the topic dictate the course and then discover new methods and materials to answer its call. It is a form of curriculum design not dissimilar to the much-discussed phenomenon-based learning, which is centred on real-world phenomena rather than the abstractions studied by traditional academic subjects. Areas we covered included founding violence (or the role of dramatic acts of violence in founding and protecting states), discursive violence (how language and representation can be a form of violence), animal violence, sexual violence, sadism, cannibalism, scapegoating, punishment, surveillance, decapitation videos and, of course, zombies.

I also attempted to develop novel ways to deliver and assess the content. The lecturing team – which included my colleagues Will Self and film studies lecturer Daniele Rugo – were given three hours to present, interact, challenge and discuss whatever material they thought fit. I wrote original pieces of journalism, spoke from my blogs and used a complex set of platforms, including YouTube clips, feature films, internet image curation and Twitter, to discuss, say, the concept of founding violence or capital punishment. At one stage, I presented my zombie walk to introduce a discussion of the ontological issues surrounding the undead. This wasn’t just a joke. Performative lecturing techniques are essential if we want to keep the students in class and off their phones. A middle-aged man who should know better doing a, frankly, rather brilliant, zombie shuffle which segued into an admittedly inexpert moonwalk (the link being Michael Jackson’s Thriller video) is at least one way to get the students’ attention. But don’t worry: in the same session we also studied the Italian philosopher Giorgio Agamben’s ruminations on the homo sacer (a Roman criminal whom anyone was permitted to kill without being considered a murderer). I am not a complete charlatan.
Read more... 

Source: Times Higher Education

Why African businesses need to embrace open source software | IDG Connect

Vincent Matinde, international IT journalist, summarizes: "Africa is opening its eyes to the power of open software solutions." 
Photo: IDG Connect
Software purchases are still an expensive affair for African companies. This has made dabbling in counterfeit software the norm for small and medium-sized enterprises, although it is not an option for most established brands.

Open software has been a reality for many developers and ICT players around the globe. In Africa, companies are now finally embracing this ‘cheaper’ way of owning systems as more IT companies launch open source based solutions.

Most people are familiar with open source projects such as WordPress, Joomla and Drupal which help users get their sites and blogs set up quickly. But there is more to open source than blogging. Most internal processes in companies need more advanced software protocols.

Dominic Kebenei, an IT specialist at Kenya’s leading media outlet, Standard Media Group, says that the implementation of an open software system not only lowered their costs but also provided an integrated approach to managing company software.

“We had outgrown our previous approach to managing the various functions of the business and looked for a more integrated approach,” he said.
 
The implementation of an open source solution, built on top of Linux, helped the company consolidate its finance, procurement, sales and distribution processes seamlessly.

“As a media company, security is a major concern for us,” Kebenei stated. “Linux offers fine-grained control over authentication and permissions, enabling us to give the right level of control to different sysadmins.”

Shadrack Serem, CTO of Nairobi-based development company Netrix Business Systems Ltd, concurs that the use of open source can help drive business initiatives.

Serem has been using open software for the last five years to create websites and to engineer ERP systems used by companies. He is also an avid participant in the CMS Africa Summit, which brings together open source techies from around Africa.

“The main thing about open source is the community behind the projects, there is always a hive of contributors and reviewers participating in enhancing and advancing the project,” Serem told IDG Connect.

“This results in a more feature rich project and more secure as well. There is also the factor of ease of access and usage that comes with open source. Open source licenses are mostly free and flexible.”

Serem said that such advantages can push African companies forward in their technology deployments.

He added that: “Africa being an emerging economy means most companies and governments are not that high up the revenue scale, this can limit their access to certain technologies that enhance their productivity and service delivery due to their proprietary and mostly expensive nature. The onset of open source alternatives circumvents this and provides them with an avenue to be more productive as well as reach a larger audience.”
Read more... 

Source: IDG Connect

Friday, July 08, 2016

Keeping tabs on tech trends | Laura Devaney, Director of News, K-12 and Higher Education


Follow on Twitter as @eSN_Laura
Laura Devaney notes, "In this week's news, we've gathered the latest technology trends together to give you a glimpse of how those trends are influencing higher education." 







Photo: eCampus News

New tech to revolutionize studying abroad education
An international education organization often used by study abroad programs is harnessing technology-supported blended teaching and learning to refine its programs, connecting students and faculty around the world and helping learners translate their global experience into a skill set suited to the modern economy.

Read more...

NMC launches app for the higher ed trends data
As part of their strategic partnership, the NMC and Mirum Learning have released a mobile app to better inform university and college professionals of the developments poised to disrupt the field. The app, now freely available for iPhone, iPad and Android tablets in the iTunes and Google Play stores, is an interactive version of the NMC Horizon Report > 2016 Higher Education Edition, a publication that has garnered a quarter of a million downloads in nearly 200 countries since its launch in February. Now in its 15th year, the NMC Horizon Project is the longest-standing study of emerging technology uptake in education.

Read more...

Higher-ed IT leaders list their top 10 priorities
Though colleges and universities are open environments, campus IT leaders face a number of challenges as digital technology use grows, said members of a recent national panel. In other words, BYOD isn’t a problem of the past; it’s still very much a challenge now. Also, security trumps all.

Read more...

Additional resources
Catch up on the most compelling K-12 news stories you may have missed this week.


This week, we’re focusing a bit on numbers. How many strategies can you apply to encourage teacher innovation? Do you know how many apps there are to help teach students with autism? And what’s changing for today’s CTOs? And what tools can help you implement formative assessment? 

Photo: eSchool News
Classroom technology: The numbers edition | eSchool News 

Source: eSchool News 

Europe’s universities: an unbreakable alliance | Times Higher Education

Photo: Howard Hotson
Universities have a European union far older than many nation states. We must not lose sight of that in the aftermath of Brexit, says Howard Hotson, professor of early modern intellectual history at the University of Oxford.

Photo: Getty

Up and down the UK, academics have spent the past fortnight spinning through cycles of shock, disbelief, terror, outrage and denial in the wake of the country’s vote to leave the European Union. The Brexit debate has exposed deep divisions within the country at large, but it has also united the university community like no other issue in recent times.
 
The bodies representing universities – including the Russell Group, Universities UK and MillionPlus – were unanimous in their support for Remain. The vice-chancellors of 103 universities published an open letter three days before the vote, urging that remaining in the EU was “necessary for the UK to maintain its position as a highly skilled and a globally competitive knowledge economy”. The rest of the university community, which so often objects to this kind of instrumentalist argument, is for once voicing full-throated agreement. According to Times Higher Education’s own poll, nearly 90 per cent of those working in higher education wanted to stay within the EU. Nearly 70 per cent of UK students also planned to vote “in”, and university alumni strongly support Remain. In all likelihood, the universities are the most uniformly and passionately pro-European constituency in the country.

How, then, to account for this remarkable unanimity? Economic self-interest is always the first explanation invoked in our neoliberal age. The UK is one of the chief beneficiaries of EU research funding. Up to 2013, for instance, it received a larger share of Framework Programme 7 funding (15.1 per cent) than any other country besides Germany, as well as 22 per cent of all funding granted by the European Research Council: double the rate of the UK’s contribution to the EU budget as a whole (11.5 per cent). EU sources account for 10 per cent of the UK’s academic research funding, including large-scale consortia and high-risk, high-gain international research projects for which there are no other funding streams. That’s a powerful and simple argument, of precisely the kind that should work well in a referendum debate. But it didn’t win the debate, and it’s not the whole story. Wales and Cornwall are also huge net beneficiaries of EU funding, but they both opted to leave.
 
Materials for a richer explanation of Remain’s hold on the university can be found in the deep recesses of history. Simply put, the university, in origin, is not a national institution. In fact, most of Europe’s oldest universities are far older than the nation states in which they are currently located. When the first university was founded (traditional date: 1088), Bologna was a semi-autonomous civic commune near the southern boundary of the Holy Roman Empire. By 1500, a dozen further universities had been founded within the still fragmented Italian peninsula, in independent republics, the Papal States and the kingdom of Naples.
 
Elsewhere, the situation was similar. The oldest university in modern Spain (Salamanca, 1134) was founded in the kingdom of Léon, which occupied the northwestern corner of the Iberian peninsula, including parts of modern Spain and Portugal. Alcalá (1293) and Santiago (1495) were established within the kingdom of Castille; Barcelona (1450) and Valencia (1499) in the kingdom of Aragon.
The oldest universities in the Czech Republic (Prague, 1348), Austria (Vienna, 1365), Germany (Heidelberg, 1386), Belgium (Leuven, 1425) and Switzerland (Basel, 1460) were all established within territories subject to the Holy Roman Empire, and so were scores more before 1806. 

The oldest universities in Scandinavia, Uppsala (1477) and Copenhagen (1479), appeared in a period in which Sweden and Denmark were united in the Kalmar Union. The oldest universities in Europe’s northeastern and southeastern corners – in Estonia (Tartu, 1632), Finland (Turku/Helsinki, 1640) and Croatia (Zagreb, 1669) – were established within Sweden’s Baltic and Austria’s Balkan empires. Scotland’s ancient universities – St Andrews (1413), Glasgow (1451), Aberdeen (1495) and Edinburgh (1582) – were founded long before the formation of the UK in 1707, and may well survive to see its break-up.
Read more...

Source: Times Higher Education

A Library Devoted to Educational Content Development | A Pass Educational Group, LLC

Andrew Pass informs, "Well today, I'm thrilled to unveil the new A Pass Educational Group content resource library. 


It has more than 42 items, including blogs, articles, white papers, webinars, and podcasts. I am very proud of it. Just one year ago I decided that I wanted A Pass to produce more informational resources, and today we have accomplished that goal." 

Take a look at the A Pass library!! 

Source: A Pass Educational Group, LLC

Wednesday, July 06, 2016

The difference between Statistical Modeling and Machine Learning, as I see it | Pulse - LinkedIn

Photo: Oliver Schabenberger
Oliver Schabenberger, Ph.D., Senior Research Statistician at SAS Institute and a SAS software user since 1991, frequently gets asked about the differences between Statistics (statistical modeling in particular), Machine Learning and Artificial Intelligence. There is indeed overlap in goals, technologies and algorithms. Confusion arises not only from this overlap, but from the buzzword salad we are being fed in non-scientific articles.


Statistical Modeling 
The basic goal of Statistical Modeling is to answer the question, “Which probabilistic model could have generated the data I observed?” So you:
  • Select a candidate model from a reasonable family of models
  • Estimate its unknown quantities (the parameters; aka fit the model to data)
  • Compare the fitted model to alternative models
For example, if your data represent counts, such as the number of customers churned or cells divided, then a model from the Poisson family, or the Negative Binomial family, or a zero-inflated model might be appropriate.

Once a statistical model has been chosen, the estimated model serves as the device for inquiries: testing hypotheses, creating predicted values, measures of confidence. The estimated model becomes the lens through which we interpret the data. We never claim that the selected model generated the data but view it as a reasonable approximation of the stochastic process on which confirmatory inference is based...
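The three steps above (select, estimate, compare) can be sketched for the count-data example. This is a minimal illustration with hypothetical churn counts; for the Poisson family, the maximum-likelihood estimate of the rate is simply the sample mean:

```python
import math

# Observed counts, e.g. customers churned per month (hypothetical data).
counts = [2, 0, 3, 1, 2, 4, 1, 3]

# Steps 1-2: choose the Poisson family and fit it; the MLE of the
# Poisson rate parameter is the sample mean.
lam = sum(counts) / len(counts)

# Step 3: compare candidate models via their log-likelihood.
def poisson_loglik(rate, data):
    return sum(k * math.log(rate) - rate - math.log(math.factorial(k))
               for k in data)

ll_fit = poisson_loglik(lam, counts)   # fitted model
ll_alt = poisson_loglik(1.0, counts)   # an alternative, fixed rate
print(lam, ll_fit > ll_alt)
```

The fitted rate then serves as the lens for inference: predicted values come from the Poisson distribution with that rate, and competing families (Negative Binomial, zero-inflated) would be compared on the same log-likelihood scale.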

Classical machine learning 
Classical machine learning is a data-driven effort, focused on algorithms for regression and classification, and motivated by pattern recognition. The underlying stochastic mechanism is often secondary and not of immediate interest. Of course, many machine learning techniques can be framed through stochastic models and processes, but the data are not thought of as having been generated by that model. Instead, the primary concern is to identify the algorithm or technique (or ensemble thereof) that performs the specific task: are customers best segmented by k-means clustering, or DBSCAN, or a decision tree, or random forest, or SVM?

In a nutshell, for the Statistician the model comes first; for the Machine Learner the data come first. Because the emphasis in machine learning is on the data, not the model, validation techniques that separate data into training and test sets are very important. The quality of a solution lies not in a p-value but in how well the solution performs on previously unseen data. Fitting a statistical model to a set of data and training a decision tree on a set of data both involve estimating unknown quantities. The best split points of the tree are determined from the data, as are the estimates of the parameters of the conditional distribution of the dependent variable...
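The train/test discipline and the data-driven split point can be sketched in a few lines. The 1-D dataset, the even/odd holdout split and the decision stump below are all illustrative assumptions, not part of the article:

```python
# Hypothetical 1-D data; the true rule behind the labels is y = 1 iff x > 5.
data = [(x, int(x > 5)) for x in range(10)]

# Separate the data into training and test sets (even x train, odd x test).
train = [row for row in data if row[0] % 2 == 0]
test = [row for row in data if row[0] % 2 == 1]

def stump_accuracy(split, rows):
    """Fraction of rows the stump 'predict 1 iff x > split' gets right."""
    return sum((x > split) == bool(y) for x, y in rows) / len(rows)

# The best split point is determined from the training data alone.
best_split = max((s + 0.5 for s in range(9)),
                 key=lambda s: stump_accuracy(s, train))

# Quality is judged on previously unseen data, not by a p-value.
print(best_split, stump_accuracy(best_split, test))
```

Note that several splits fit the training data perfectly, and the one chosen (4.5) misclassifies a held-out point; exactly this gap between training fit and test performance is what the validation step is designed to expose.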

Modern Machine Learning 
A machine learning system is truly a learning system if it is not programmed to perform a task, but is programmed to learn to perform the task. I refer to this as Modern Machine Learning. Like the classical variant, it is a data-driven exercise. Unlike the classical variant, modern machine learning does not rely on a rich set of algorithmic techniques. Almost all applications of this form of machine learning are based on deep neural networks.
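The "programmed to learn" distinction can be made concrete with a minimal sketch (the data, the single-weight model and the learning rate are all hypothetical): instead of hard-coding the rule y = 2x, the program is given examples and learns the weight by gradient descent, the same principle that drives deep neural networks at far larger scale:

```python
# Examples of the task to be learned: the target rule is y = 2x,
# but the program is never told that rule directly.
samples = [(x, 2.0 * x) for x in range(1, 6)]

w = 0.0      # single model weight, initialised to zero
lr = 0.01    # learning rate (an illustrative choice)

for _ in range(200):            # repeated passes over the examples
    for x, y in samples:
        pred = w * x            # current prediction
        grad = 2 * (pred - y) * x   # derivative of squared error w.r.t. w
        w -= lr * grad          # nudge the weight downhill

print(round(w, 3))              # the learned weight approaches 2.0
```

A deep network repeats this loop over millions of weights and nonlinear layers, but the mechanism, adjusting parameters to reduce error on examples rather than encoding the task by hand, is the same.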
Read more... 

Source: Pulse - LinkedIn

The Future of Work and Artificial Intelligence | Talent Management

Lauren Dixon, Associate Editor, Talent Management  summarizes, "Technological innovation, such as artificial intelligence, requires that companies and individuals take ownership of education to keep up with the future of work."

Photo: Talent Management

In 2012, Dennis Mortensen had 1,019 meetings, each of which required an average of roughly eight back-and-forth emails to schedule. Now his personal assistant, Amy Ingram, schedules his meetings for him.

There’s just one thing: Amy isn’t a human being — she’s a virtual assistant (notice her initials, AI). Every time Mortensen comes across a contact interested in meeting with him, the CEO and founder of New York City-based artificial intelligence firm x.ai simply sends them a return email copying Amy, who takes care of the rest.

“In raw numbers, I’ve saved about an hour every day — an hour which I would otherwise have to use in really rudimentary work where I add not much value,” Mortensen said of Amy’s help scheduling meetings.

Virtual assistants like Amy are becoming more common. Just as household technology platforms like Apple’s “Siri” and Microsoft’s “Cortana” have helped consumers navigate their lives more easily, other forms of rudimentary artificial intelligence platforms are starting to proliferate in the market, many of them upending traditional business roles.

To this point, such technological innovation has brought nothing but excitement to those who have found its benefits. Still, as AI and automation technology advances, some are fearful of the wide-scale human job loss it might bring.

In some instances, this is already happening. Automation in sectors like manufacturing has taken over jobs long held by humans. Experts predict jobs rooted in repetitive, computational tasks are likely to succumb to technology in the not-too-distant future.

Moshe Vardi, an esteemed computer science professor from Rice University in Houston, grabbed media attention in February when he boldly predicted to attendees at the annual meeting of the American Association for the Advancement of Science that artificial intelligence could push almost half of the world’s workers out of their jobs in the next 30 years.

Others who fear a robotic takeover are Tesla and SpaceX CEO and founder Elon Musk and physicist Stephen Hawking. Both are not just fearful of AI’s ability to displace human jobs in the economy; they think robots could potentially end mankind altogether.

Not everyone is as bearish on the future of jobs and technology — or humanity. Dermot O’Brien, chief human resources officer at payroll provider and consulting firm Automatic Data Processing Inc., or ADP, said technological advances are more likely to further enable human productivity, not replace it entirely. People will “be able to focus on more the interesting, higher-impact side of their roles,” he said.

Talent managers play an important role in technology’s integration in the economy, experts say, particularly as it pertains to employee development. With more employees expected to integrate technology in their work, a greater emphasis on learning and development will be needed to keep pace.
Read more...

Source: Talent Management

Study exposes major flaw in classic artificial intelligence test | Science Daily

ScienceDaily reports, "A serious problem in the Turing test for computer intelligence is exposed in a study published in the Journal of Experimental and Theoretical Artificial Intelligence."



If a machine were to 'take the Fifth Amendment' -- that is, exercise the right to remain silent throughout the test -- it could, potentially, pass the test and thus be regarded as a thinking entity, authors Kevin Warwick and Huma Shah of Coventry University argue. However, if this is the case, any silent entity could pass the test, even if it were clearly incapable of thought.

The test, devised in 1950 by pioneering computer scientist Alan Turing, assesses a machine's ability to exhibit intelligent behaviour indistinguishable from that of a human. Also known as the 'imitation game', it requires a human judge to converse with two hidden entities, a human and a machine, and then determine which is which.

Warwick and Shah's study looks at transcripts of a number of conversations from actual Turing tests in which the hidden machine remained silent. In each case, the human judge was unable to say for certain whether they were interacting with a person or a machine.

Thus, a machine could potentially pass the Turing test simply by remaining silent. The judge would be unable to determine whether the silent entity was a human choosing not to answer the questions, a smart machine that had decided not to reply, or a machine experiencing technical problems that prevented it from answering (as was actually the case in the transcripts studied).
Read more... 

Additional resources
Journal Reference:
  1. Kevin Warwick, Huma Shah. Taking the fifth amendment in Turing’s imitation game. Journal of Experimental & Theoretical Artificial Intelligence, 2016; 1 DOI: 10.1080/0952813X.2015.1132273
Source: ScienceDaily