In 1984, Apple and IBM were fierce competitors, as showcased by one famous Super Bowl ad.1 The two companies were fighting a battle over the personal computer, a market that would see around two million devices sold that year.2 And the biggest robotics story was a small-budget film called The Terminator, about a robot coming back in time to wipe out humanity and starring a relatively unknown Austrian actor. It would become one of the biggest box office hits of the year.
Fast forward thirty-three years. Apple and IBM are now partners, collaborating on mobile applications and services for enterprise clients, and the technology industry is estimated to ship more than two billion devices in 2017.3 In 2015, Arnold Schwarzenegger revitalized his post-gubernatorial career with a big-budget film called Terminator Genisys, about a robot coming back in time to wipe out humanity. The more things change, the more some things stay the same.
This year, the media hype has been all about robots and artificial intelligence (AI). Tesla CEO Elon Musk warned that AI could wipe out humanity. Microsoft co-founder Bill Gates weighed in that AI was one of our most pressing threats. And even the preeminent scientist Stephen Hawking opined that AI might be mankind's last invention.4 So, 2017 has been the year when AI became the meme that conquered the world. Every startup is promising machine learning, big vendors are rushing to brand their own AI engines, and economists worry that jobs will become a thing of the past. Never mind the robot apocalypse.
The reality of AI is both less dramatic and more impactful than the hype. We are entering a new era of computing that will bring tremendous change. The past thirty-plus years have seen the personalization of computing hardware, with single, big machines supplanted by supercomputers in everyone's pockets and on-demand cloud services. This technical advance has brought individuals access to services anytime, anywhere and has spurred major transformations on college and university campuses. In the new AI era, machine learning and big data, which together enable cognitive computing, will bring personalization through software to every corner of our economy. Higher education will play a critical role in how this new era of the economy evolves, and it's important that academy leaders understand the potential and the risks in order to develop a strategy for navigating the coming disruptions...
The Cognitive Campus
What is the significance for higher education? These developments mean that the college or university will need to become a cognitive campus. First, consumer expectations will continue to evolve based on new personalized services. Just as higher education institutions had to embrace personal and mobile computing to serve students and educators, they will need to develop deeper personalized services in the AI era. Students and other constituents will expect more personalization as their other service providers get to "know them." The age of hyper-personalization will continue to pressure institutions to meet rising demands in learning and support services.
Second, AI will create opportunities for cost reduction and administrative efficiency, just as in any other industry. Today, AI and big data can improve IT security on campus by identifying threats earlier and enabling more rapid interventions. Institutions will deploy AI tools across the student life cycle, improving retention and optimizing outcomes, which frequently have a bottom-line effect on costs.
But the most important issue for higher education is to prepare the next generation to prosper in an AI-driven economy. A persistent debate is whether higher education should focus on preparing students for citizenship or for the workforce. The answer is both. With increasing competition and new alternatives, institutions must align to workforce needs and ensure that their students gain employment. The accelerating rate of change in industry will require institutions to be far more adaptable in the future. That pace of change will make the idea of "lifelong learning" truly an imperative for the individual. In addition, the broad impact of AI on society reinforces the need for higher education to address questions of equity, ethics, and citizenship. Students should be asking themselves: "How is my personal data being used without my knowledge? Should it be? What do AI algorithms do for me? What are their limits?" As AI systems work their way across the economy, we will need a workforce and a citizenry that are deeply familiar with the technology, its capabilities, and the social issues it creates. We should view this familiarity as a requirement not only for those steeped in the technology but for everyone...
The cognitive era will unleash a new wave of innovation to reengineer business processes, lower costs, and build new personalized services. Industries and jobs will be transformed at an accelerating pace. Education will be critical for individuals and society to prosper in this new era. Higher education leaders should begin preparing their institutions for the challenges and opportunities that lie ahead.
Higher education institutions can become the "learning home" for individuals throughout their lives. Cognitive computing will reach prospective students sooner in their learning journey, guide them through the right learning programs, and provide ongoing support to keep their skills relevant. Institutions will become vital to the long-term success of their learners. These institutions can be the stewards of individual skill profiles and can leverage cognitive tools to become pervasive advisors in a rapidly changing economy.
Source: EDUCAUSE Review