
Tuesday, July 19, 2016

What Is A Paradigm Shift, Anyway? | WBAA

Photo: Tania Lombrozo
Tania Lombrozo, psychology professor at the University of California, Berkeley, writes, "Thomas Kuhn, the well-known physicist, philosopher and historian of science, was born 94 years ago today. He went on to become an important and broad-ranging thinker, and one of the most influential philosophers of the 20th century."

Photo: Yagi Studio / Getty Images


The Structure of Scientific Revolutions
Kuhn's 1962 book, The Structure of Scientific Revolutions, transformed the philosophy of science and changed the way many scientists think about their work. But his influence extended well beyond the academy: The book was widely read — and seeped into popular culture. One measure of his influence is the widespread use of the term "paradigm shift," which he introduced in articulating his views about how science changes over time.

Inspired, in part, by the theories of psychologist Jean Piaget, who saw children's development as a series of discrete stages marked by periods of transition, Kuhn posited two kinds of scientific change: incremental developments in the course of what he called "normal science," and scientific revolutions that punctuate these more stable periods. He suggested that scientific revolutions are not a matter of incremental advance; they involve "paradigm shifts."

Talk of paradigms and paradigm shifts has since become commonplace — not only in science, but also in business, social movements and beyond. In a column at The Globe and Mail, Robert Fulford describes paradigm as "a crossover hit: It moved nimbly from science to culture to sports to business."

But what, exactly, is a paradigm shift? Or, for that matter, a paradigm?

The Merriam-Webster dictionary offers the following:

Simple Definition of paradigm:
  • a model or pattern for something that may be copied
  • a theory or a group of ideas about how something should be done, made, or thought about

Accordingly, a paradigm shift is defined as "an important change that happens when the usual way of thinking about or doing something is replaced by a new and different way."
More than 50 years after Kuhn's famous book, these definitions may seem intuitive rather than technical. But do they capture what Kuhn actually had in mind in developing an account of scientific change? 
Read more...

Source: WBAA



Monday, July 18, 2016

An eBook by Trivantis - eLearning 101: A Practical Guide | Trivantis


Welcome to the wonderful world of eLearning—a place where you can create and deliver exciting and helpful instructional content, and then implement it online.
Download the free eBook now

Inside this eBook, you'll find a complete guide to the foundations of eLearning development. Get the scoop on instructional design methods and go behind the scenes of eLearning development—from phase one to completion. This eBook is your info-packed resource to all things eLearning. 


 What’s inside the new eBook?
  • The history of eLearning
  • An overview of common Instructional Design methods
  • Behind the scenes of eLearning development, from phase 1 to completion
  • Graphic design tips and more
Trivantis asks, "What Is eLearning?"
eLearning is learning that happens on an electronic device. It also refers to the practice of creating a learning activity, using a tool capable of web distribution, to be placed into an online repository. But that leaves the following questions:
  1. What type of electronic device?
  2. What is an online repository?
  3. What is a learning activity?
  4. Which tools are capable of web distribution?

Download the free eBook now 

Source: Trivantis 



Artificial Intelligence Swarms Silicon Valley on Wings and Wheels | New York Times

Photo: John Markoff
John Markoff, senior writer for The New York Times, reports, "For more than a decade, Silicon Valley’s technology investors and entrepreneurs obsessed over social media and mobile apps that helped people do things like find new friends, fetch a ride home or crowdsource a review of a product or a movie."

A Bossa Nova robot gliding through a store aisle to check inventory.
Photo: New York Times

Now Silicon Valley has found its next shiny new thing. And it does not have a “Like” button.
The new era in Silicon Valley centers on artificial intelligence and robots, a transformation that many believe will have a payoff on the scale of the personal computing industry or the commercial internet, two previous generations that spread computing globally. Computers have begun to speak, listen and see, as well as sprout legs, wings and wheels to move unfettered in the world.

The shift was evident in a Lowe’s home improvement store here this month, when a prototype inventory checker developed by Bossa Nova Robotics silently glided through the aisles using computer vision to automatically perform a task that humans have done manually for centuries.

The robot, which was skilled enough to autonomously move out of the way of shoppers and avoid unexpected obstacles in the aisles, alerted people to its presence with soft birdsong chirps. Gliding down the middle of an aisle at a leisurely pace, it can recognize bar codes on shelves, and it uses a laser to detect which items are out of stock.

Silicon Valley’s financiers and entrepreneurs are digging into artificial intelligence with remarkable exuberance. The region now has at least 19 companies designing self-driving cars and trucks, up from a handful five years ago. There are also more than a half-dozen types of mobile robots, including robotic bellhops and aerial drones, being commercialized.

“We saw a slow trickle in investments in robotics, and suddenly, boom — there seem to be a dozen companies securing large investment rounds focusing on specific robotic niches,” said Martin Hitch, chief executive of Bossa Nova, which has a base in San Francisco.

Funding in A.I. start-ups has increased more than fourfold to $681 million in 2015, from $145 million in 2011, according to the market research firm CB Insights. The firm estimates that new investments will reach $1.2 billion this year, up 76 percent from last year.

“Whenever there is a new idea, the valley swarms it,” said Jen-Hsun Huang, chief executive of Nvidia, a chip maker that was founded to make graphic processors for the video game business but that has turned decisively toward artificial intelligence applications in the last year. “But you have to wait for a good idea, and good ideas don’t happen every day.”

By contrast, funding for social media start-ups peaked in 2011 before plunging. That year, venture capital firms made 66 social media deals and pumped in $2.4 billion. So far this year, there have been just 10 social media investments, totaling $6.9 million, according to CB Insights. Last month, the professional social networking site LinkedIn was sold to Microsoft for $26.2 billion, underscoring that social media has become a mature market sector.
Read more...

Source: New York Times



Five Ways to Help Students Succeed in the Online Classroom | Faculty Focus


Photo: Amy Hankins
"More and more students are flocking to the online classroom for the convenience of earning college credits from the comfort of their home. However, many of these students are ill-prepared for the dedication and discipline needed to be successful in the online environment." according to Amy Hankins, worked in education for 10 years, including online learning for eight years. Currently she is working as a full-time instructor for an online university.
Students often have misconceptions concerning the rigor of online courses, and they frequently underestimate the amount of time and discipline necessary to complete assignments, discussions, quizzes, and projects. Therefore, it is important for the instructor to set the tone of the course to help students succeed. So how do you help your students succeed in the online classroom?
  1. Provide Detailed Instructions and Anticipate Questions. Ensure all instructions are easy to follow. Provide step-by-step instructions and ensure no detail is overlooked. Do not assume students will be able to read between the lines; rather, provide students with every detail needed to complete the assignment, participate in discussions, navigate the course, etc. Consider the possible questions students may ask about the materials and answer them before students have an opportunity to ask. Provide these answers within the course instructions and course announcements.
  2. Post Announcements. Remain present in the course by posting announcements. At the very least, post an announcement each week to wrap up the previous week and let students know what to expect in the upcoming week. If possible, try to post at least two announcements per week. Announcements provide an opportunity to do some housecleaning. Provide reminders, clarification, and overviews to help engage and motivate students and help them see that you’re involved in their learning.
Read more...

Additional resources


5 Time Management Tips For Managing An Online Classroom | eLearning Industry
 
Source: Faculty Focus 



Five Fundamentals of Faculty Development | Faculty Focus

Photo: Patty Phelps
Dr. Patty Phelps, professor in the Department of Teaching & Learning at the University of Central Arkansas, writes, "I am not a skilled athlete, but I have watched enough sporting events to know that the fundamentals are essential to both player and team success."

Photo: Faculty Focus

Coaches can often be heard repeating such maxims as “keep your eye on the ball,” “follow through,” and “hold your position.”

Faculty development has its own set of fundamentals. More than 20 years ago, I co-authored a grant establishing the faculty development center at the University of Central Arkansas. Over the years, I have served as faculty coordinator, co-director, and director. My experiences may benefit others who are working in the field or plan to in the future. Here are five fundamentals for designing and delivering effective faculty development:
  1. Begin with a clear vision. Almost every authority on leadership will mention the importance of creating a mental picture of your ideal future (i.e., a vision). As a starting point identify your core values. What ideals are most important to you and your institution? How do you see your role as a faculty developer (e.g., mentor, encourager, change agent, etc.)? What do you want faculty development to look and feel like on your campus? Gather input from center staff as well as your constituents. Incorporate these ideas into a brief, descriptive statement. This vision can then serve as a guide for future decisions and actions. (Note: Be sure to check for alignment with your institution’s mission.)
  2. Maintain the right perspective. In my session at the recent Teaching Professor Conference, I included a cell phone survey regarding effective faculty development. The most-missed survey question revealed that many faculty developers participating in the workshop viewed faculty development from a remedial perspective. This is a less-than-effective stance. Faculty who take advantage of professional development activities must not be seen as deficient. Rather than approaching faculty development as a way to “fix” designated faculty, recognize its potential to boost the instructional vitality of all faculty. When you see learning to teach as a lifelong process involving continual improvement, you are less inclined to take a remedial view of faculty development. The right perspective is one that is grounded in growth; it focuses on improving student learning, serves all campus faculty, and includes a variety of programs and services. No one group or type of individual is singled out. (Additional guidance: Weimer’s 2010 book, Inspired college teaching: A career-long resource for professional growth, is an excellent resource to orient you toward the desired perspective.)
Read more...

Source: Faculty Focus



How do you keep your learning strategy flexible? | TrainingZone and The Open University


Take a closer peek at this whitepaper from TrainingZone in association with The Open University below.

Download now

Learning and development is going through significant changes. This whitepaper, in association with The Open University, looks at what is driving this change, and provides practical advice on how L&D teams can support their businesses with a more flexible and agile approach to learning.

Download now to learn:
  • What is a flexible learning strategy? 
  • How to create one in eight easy steps 
  • The impact of 'learning science'
The Open University (OU) and TrainingZone write, "We live and work in a VUCA world these days. VUCA stands for volatile, uncertain, complex and ambiguous and this new environment has changed the way that organisations, and all the functions within them, operate. That includes HR and L&D. Businesses need to be agile in order to gain and maintain a competitive advantage in this VUCA world. In fact, businesses need to be agile just to remain in business."
Download now

Source: TrainingZone



Sunday, July 17, 2016

How the internet was invented | Technology | The Guardian

Photo: Ben Tarnoff

Take a closer peek at this article below by Ben Tarnoff, who loves history, science fiction, and California.
Photo: Danae Diaz at PVUK

In the kingdom of apps and unicorns, Rossotti’s is a rarity. This beer garden in the heart of Silicon Valley has been standing on the same spot since 1852. It isn’t disruptive; it doesn’t scale. But for more than 150 years, it has done one thing and done it well: it has given Californians a good place to get drunk.

During the course of its long existence, Rossotti’s has been a frontier saloon, a gold rush gambling den, and a Hells Angels hangout. These days it is called the Alpine Inn Beer Garden, and the clientele remains as motley as ever. On the patio out back, there are cyclists in spandex and bikers in leather. There is a wild-haired man who might be a professor or a lunatic or a CEO, scribbling into a notebook. In the parking lot is a Harley, a Maserati, and a horse.

It doesn’t seem a likely spot for a major act of innovation. But 40 years ago this August, a small team of scientists set up a computer terminal at one of its picnic tables and conducted an extraordinary experiment. Over plastic cups of beer, they proved that a strange idea called the internet could work.

 A plaque at Rossotti’s commemorating the August 1976 experiment.
Photograph: Courtesy of the Alpine Inn Beer Garden, formerly Rossotti's
The internet is so vast and formless that it’s hard to imagine it being invented. It’s easy to picture Thomas Edison inventing the lightbulb, because a lightbulb is easy to visualize. You can hold it in your hand and examine it from every angle.

The internet is the opposite. It’s everywhere, but we only see it in glimpses. The internet is like the holy ghost: it makes itself knowable to us by taking possession of the pixels on our screens to manifest sites and apps and email, but its essence is always elsewhere.

This feature of the internet makes it seem extremely complex. Surely something so ubiquitous yet invisible must require deep technical sophistication to understand. But it doesn’t. The internet is fundamentally simple. And that simplicity is the key to its success.

The people who invented the internet came from all over the world. They worked at places as varied as the French government-sponsored computer network Cyclades, England’s National Physical Laboratory, the University of Hawaii and Xerox. But the mothership was the US defense department’s lavishly funded research arm, the Advanced Research Projects Agency (Arpa) – which later changed its name to the Defense Advanced Research Projects Agency (Darpa) – and its many contractors. Without Arpa, the internet wouldn’t exist.

As a military venture, Arpa had a specifically military motivation for creating the internet: it offered a way to bring computing to the front lines. In 1969, Arpa had built a computer network called Arpanet, which linked mainframes at universities, government agencies, and defense contractors around the country. Arpanet grew fast, and included nearly 60 nodes by the mid-1970s.

But Arpanet had a problem: it wasn’t mobile. The computers on Arpanet were gigantic by today’s standards, and they communicated over fixed links. That might work for researchers, who could sit at a terminal in Cambridge or Menlo Park – but it did little for soldiers deployed deep in enemy territory. For Arpanet to be useful to forces in the field, it had to be accessible anywhere in the world.

Picture a jeep in the jungles of Zaire, or a B-52 miles above North Vietnam. Then imagine these as nodes in a wireless network linked to another network of powerful computers thousands of miles away. This is the dream of a networked military using computing power to defeat the Soviet Union and its allies. This is the dream that produced the internet.

Making this dream a reality required doing two things. The first was building a wireless network that could relay packets of data among the widely dispersed cogs of the US military machine by radio or satellite. The second was connecting those wireless networks to the wired network of Arpanet, so that multimillion-dollar mainframes could serve soldiers in combat. “Internetworking,” the scientists called it.

Internetworking is the problem the internet was invented to solve. It presented enormous challenges. Getting computers to talk to one another – networking – had been hard enough. But getting networks to talk to one another – internetworking – posed a whole new set of difficulties, because the networks spoke alien and incompatible dialects. Trying to move data from one to another was like writing a letter in Mandarin to someone who only knows Hungarian and hoping to be understood. It didn’t work.

In response, the architects of the internet developed a kind of digital Esperanto: a common language that enabled data to travel across any network. In 1974, two Arpa researchers named Robert Kahn and Vint Cerf published an early blueprint. Drawing on conversations happening throughout the international networking community, they sketched a design for “a simple but very flexible protocol”: a universal set of rules for how computers should communicate.

These rules had to strike a very delicate balance. On the one hand, they needed to be strict enough to ensure the reliable transmission of data. On the other, they needed to be loose enough to accommodate all of the different ways that data might be transmitted.
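To make that balance concrete, here is a minimal Python sketch of the encapsulation idea (the networks, framing rules and names below are invented for illustration; this is not Kahn and Cerf's actual design). Two incompatible networks each wrap the same universal packet in their own native framing, and each strips only its own framing on receipt; the shared packet format is the one thing everyone must agree on.

import json

def make_packet(src, dst, payload):
    # The strict part: a single, network-agnostic packet format that
    # every participating network agrees to carry unchanged.
    return {"src": src, "dst": dst, "payload": payload}

def send_over_radio(packet):
    # The loose part: one network wraps the packet in its own framing...
    return b"RADIO|" + json.dumps(packet).encode()

def send_over_wire(packet):
    # ...while another, incompatible network frames it quite differently.
    return b"<WIRE>" + json.dumps(packet).encode() + b"</WIRE>"

def receive(frame):
    # Each network strips only its own framing; the packet inside is
    # universal, so it can cross any mix of networks and still be read.
    if frame.startswith(b"RADIO|"):
        body = frame[len(b"RADIO|"):]
    elif frame.startswith(b"<WIRE>"):
        body = frame[len(b"<WIRE>"):-len(b"</WIRE>")]
    else:
        raise ValueError("unknown network framing")
    return json.loads(body)

pkt = make_packet("field-radio", "menlo-park-mainframe", "hello")
assert receive(send_over_radio(pkt)) == pkt
assert receive(send_over_wire(pkt)) == pkt

However the data travels (radio, satellite or wire), the rules about the packet itself never change, which is the strict-but-loose balance described above.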

Vinton Cerf, left, and Robert Kahn, who devised the first internet protocol.
Photograph: Louie Psihoyos/Corbis

“It had to be future-proof,” Cerf tells me. You couldn’t write the protocol for one point in time, because it would soon become obsolete. The military would keep innovating. They would keep building new networks and new technologies. The protocol had to keep pace: it had to work across “an arbitrarily large number of distinct and potentially non-interoperable packet switched networks,” Cerf says – including ones that hadn’t been invented yet. This feature would make the system not only future-proof, but potentially infinite. If the rules were robust enough, the “ensemble of networks” could grow indefinitely, assimilating any and all digital forms into its sprawling multithreaded mesh.

Eventually, these rules became the lingua franca of the internet. But first, they needed to be implemented and tweaked and tested – over and over and over again. There was nothing inevitable about the internet getting built. It seemed like a ludicrous idea to many, even among those who were building it. The scale, the ambition – the internet was a skyscraper and nobody had ever seen anything more than a few stories tall. Even with a firehose of cold war military cash behind it, the internet looked like a long shot.

Then, in the summer of 1976, it started working.
Read more...

Source: The Guardian



Saturday, July 16, 2016

Your midsummer ed-tech check-in | Laura Devaney, Director of News, K-12 and Higher Education


Follow on Twitter as @eSN_Laura
"In this week's news, we take a look back at some of the most popular summer stories up until now. Stay tuned for what we have planned for the rest of this summer." report


Catch up on the most compelling higher-ed news stories you may have missed this week. 

Each Friday, Laura Devaney will be bringing you a recap of some of the most interesting and thought-provoking news developments that occurred over the week. 

I can’t fit all of our news stories here, though, so feel free to visit eCampusNews.com and read up on other news you may have missed.



Photo: eCampus News

Why Teachers College blew up its wireless network
What would you do if you could blow up your school’s entire wireless network and start all over again? That’s exactly what Teachers College, Columbia University has done, as the storied institution seeks to position itself for the 21st century, complete with cutting-edge online instruction and highly automated classrooms.

Read more...
 
30 technologies ushering in the future for education
If the bright lights and casino bling of Las Vegas weren’t enough to dazzle conference-goers from around the world, the awe-inspiring displays of futuristic technology littering the Infocomm 2016 show floor certainly were, such as Epson’s trippy Infinity Room and LG’s mind-bending OLED curved tiling display.

Read more...

Students say campus technology needs major overhaul–but why?
According to a new report, thanks to a lack of digital options and the tedious online protocols that are part of many campus technology initiatives, students say they study less and think less of their university.

Read more...
 

Could these 3 burgeoning nontraditional pathways be a boon for traditional institutions?
Unconventional education organizations say recent high school grads are flocking to nontraditional pathways; could this be good for college and university admissions?

Read more...

Additional resources
Catch up on the most compelling K-12 news stories you may have missed this week. 


Photo: eSchool News

Let’s take a look back at some of the most popular stories from the first half of this summer. What do they have in common? They all help classroom teachers and education leaders leverage education technology in the best ways possible.

Your midsummer ed-tech check-in  | eSchool News 

Source: eCampus News 



The next wave of disruption: Graph-based machine learning | IDG Connect

Kathryn Cave, Editor at IDG Connect, notes, "We look at the pros and cons of machine learning and graph technology and how the two are now working together."

Photo: IDG Connect

Machine learning (ML) is getting a lot of attention at the moment. This is partly because a slew of new companies are emerging which are using it in innovative ways. And partly because it can get easily subsumed into the fuss and furore about AI and the rise of evil robot intelligence. Graph technology, on the other hand, is something which takes more of a back seat and yet, in a lot of ways, also sits at the forefront of the big data and analytics movement. 

“We firmly believe that it's at the intersection of machine learning and graph technology where the next evolution lies and where new disruptive companies are emerging,” says Ash Damle, Founder and CEO at Lumiata, which helps healthcare organisations make predictions.

“It's only recently that companies can use graph at true scale and, now, by integrating with ML, we're moving much more into a core understanding of artificial intelligence, deep neural networks and image recognition.”
 
So, in the simplest terms what are these two technologies?
 
At the most basic level, machine learning takes large quantities of data to make predictions about future events, while graph technology is more concerned with the relationships between different data points.

Claus Jepsen, Chief Architect, R&D at Unit4, which provides enterprise applications, summarises:

“Machine Learning is really the umbrella and graph technology is a way of representing data when using machine learning.”

Peter Duffy, CTO of capacity-planning-as-a-service provider Sumerian, adds that this means: “There is huge potential for businesses to take advantage of both.”

David Thompson, Sr. Director of Product Management at LightCyber, further clarifies: “Graph technology can be considered a type or technique of machine learning, or, at a minimum, aspects of graph technology have strong application to machine learning.”
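To give a flavour of how a graph can serve as the data representation for machine learning, here is a toy Python sketch of one of the simplest graph-learning techniques: link prediction by counting shared neighbours. The graph and the names in it are invented for illustration and are not drawn from any of the companies quoted above.

from itertools import combinations

# A small graph as an adjacency map: node -> set of neighbours.
edges = [("ann", "bob"), ("bob", "cara"), ("ann", "cara"),
         ("cara", "dan"), ("bob", "dan"), ("dan", "eve")]
graph = {}
for a, b in edges:
    graph.setdefault(a, set()).add(b)
    graph.setdefault(b, set()).add(a)

def common_neighbours(u, v):
    # Score a candidate link by how many neighbours u and v share:
    # the relationships between data points drive the prediction.
    return len(graph[u] & graph[v])

# Rank every pair that is not yet connected; higher scores suggest
# links that are more likely to form.
candidates = [(u, v) for u, v in combinations(sorted(graph), 2)
              if v not in graph[u]]
for u, v in sorted(candidates, key=lambda p: common_neighbours(*p), reverse=True):
    print(f"{u} -- {v}: {common_neighbours(u, v)} shared neighbours")

Production systems replace this hand-written score with features learned over the graph at scale, but the core idea is the same: the prediction comes from the relationships between data points, not from the points in isolation.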
Read more... 

Source: IDG Connect



How a course about violence changed the way students are taught and assessed | Times Higher Education

Photo: William Watkin
Photo: Will Self
"A literature module developed at Brunel University London has moved away from the traditional essay format and embraced the digital age" according to William Watkin, professor of contemporary literature and philosophy at Brunel University London and Will Self, professor of contemporary thought at Brunel University London and is the author of nine novels.

‘I saw a strangely zombie-like response to the gathering impact of bi-directional digital media’
 

When I began teaching at Brunel University London five years ago, I’d had little but glancing contact with the academy since I left higher education myself. My experience of reading for a politics, philosophy and economics degree at the University of Oxford in the late 1970s and early 1980s may, even for the time, have embodied anachronisms − but the theory and practice of arts and humanities pedagogy I found at Brunel in 2011 remained in essence the same.

Photo: iStock/Getty montage
 At the core of it all, it seems to me, lies the text. At Oxford, I often studied in Duke Humfrey’s library, the dark and woody 14th-century cell deep in the dense honeycomb of the Bodleian. Here, surrounded by ancient tomes, I did my best to impress upon my memory the outline of the canon while at the same time shading in a fraction of its content.

The entire system of learning at Oxford, so far as I can recall, consisted of the combination of mnemonics, composition and argumentation. Reading lists were prodigious: often 20 or 30 items − both entire volumes and journal articles – so redundancy was a given: hours needed to be spent in the library to extract the pith from acres of paper. I took two courses (as modules were then called) every term, and the coursework requirement was an essay of 3,000 words per week for each of them; the sheer amount I had to write gave me the core facility needed for an entire adult working life as a professional writer.

The argumentation was, of course, astonishingly thorough when compared with the meagre “contact hours” most contemporary students are mandated: a full hour vis-à-vis, usually one-to-one, reading out your essay and then picking it apart. Lectures were plentiful and accessible, but I confess: with two hours of tutorials a week, and a minimum of 12 to write my papers, I needed all the remaining ones simply in order to read, if I were to be able to absorb sufficient information to substantiate the sort of large-scale theoretical paradigms I was being introduced to.

As I say, when I arrived at Brunel I found the lineaments of this system still present: reading lists and essay assignments; lectures, seminars and tutorials. I also realised immediately the deep commitment many of my colleagues had to serious, effective pedagogy, and their preparedness to do right by students facing massively increased pressures owing to the marketisation of the sector. However, what I also saw (and I believe this, in part, to be one of the many unforeseen consequences of the 2010 Browne “reforms”, which ushered in the tripling of tuition fees) was a strangely zombie-like response to the gathering impact of bi-directional digital media (BDDM) on the study of arts and the humanities. This is at once a vast subject − and just one aspect of the technological revolution we’re living through; one of such scale, rapidity and obvious transformative potential, it deserves to − and does − generate ever more baroque and reflexive forms of appraisal and criticality. That being noted, there are simple things to be said, and for me they coalesce around a single conceptual object: the skeuomorph.

‘My students chose to look away not out of fear or moral revulsion but out of ennui’
 

This was the first year of my new, final-year literature course at Brunel, entitled Violence. It is, I believe, a unique course in the UK, allowing English literature students the opportunity to engage with the art, theory, politics and technology of violence.

Photo: Getty/Alamy/iStock montage
The course is an innovation in how to study literary production in that it is not determined by a particular period, “ism” or theoretical approach. It is not even dominated by the need to study works of literature, taking, as it does, textual study in its absolutely widest possible sense. Novels are texts; paintings are text; philosophy is text; film is text. But YouTube is also text; Twitter is text; wolf whistles can be text. In the final week of the course, the reading set for the class said simply: “the internet”.

The idea behind the course was to let urgent issues dictate the nature of its design, rather than apply already tested frameworks of study to, say, representations of violence online. I envisage that this may be one way forward for literary studies and the humanities in general: let the topic dictate the course and then discover new methods and materials to answer its call. It is a form of curriculum design not dissimilar to the much-discussed phenomenon-based learning, which is centred on real-world phenomena rather than the abstractions studied by traditional academic subjects. Areas we covered included founding violence (or the role of dramatic acts of violence in founding and protecting states), discursive violence (how language and representation can be a form of violence), animal violence, sexual violence, sadism, cannibalism, scapegoating, punishment, surveillance, decapitation videos and, of course, zombies.

I also attempted to develop novel ways to deliver and assess the content. The lecturing team – which included my colleagues Will Self and film studies lecturer Daniele Rugo – were given three hours to present, interact, challenge and discuss whatever material they thought fit. I wrote original pieces of journalism, spoke from my blogs and used a complex set of platforms, including YouTube clips, feature films, internet image curation and Twitter, to discuss, say, the concept of founding violence or capital punishment. At one stage, I presented my zombie walk to introduce a discussion of the ontological issues surrounding the undead. This wasn’t just a joke. Performative lecturing techniques are essential if we want to keep the students in class and off their phones. A middle-aged man who should know better doing a, frankly, rather brilliant, zombie shuffle which segued into an admittedly inexpert moonwalk (the link being Michael Jackson’s Thriller video) is at least one way to get the students’ attention. But don’t worry: in the same session we also studied the Italian philosopher Giorgio Agamben’s ruminations on the homo sacer (a Roman criminal whom anyone was permitted to kill without being considered a murderer). I am not a complete charlatan.
Read more... 

Source: Times Higher Education

