Our guide to computing courses


Computers are everywhere nowadays. They are fundamental components of every company and institution in the land, with pivotal roles to play in everything from energy supply and healthcare through to retail and design. There are few jobs in modern Britain that don’t involve daily computer usage, and few homes that still lack the processing power of a laptop, desktop, tablet or smartphone.


So why do computers matter?

Computers are unquestionably here to stay, and their influence on our lives grows greater with every passing day. However, computers can be daunting to anyone who has never had the desire or opportunity to use one before. A computing course can therefore be absolutely invaluable in opening up a whole new world of work, leisure and communications – even for someone who has never sat down at a keyboard or touched a screen before.


Am I really up to it?

One of the greatest attributes of modern computers is their ever-increasing levels of usability. Not so long ago, a computer would simply display a blank screen and a flashing cursor when it was switched on, whereas today, tutorials and user guides will help rookies through each task and program. Apple Mac computers are especially easy to operate, and software packages typically group the most important tasks together (like starting a new file or making a copy of an existing one), enabling novices to navigate their way around.

For people who don’t know one end of a keyboard from the other, a computing course is pretty much an essential learning tool, and it is only going to become more so with every passing year. It will enable anyone who completes it to avoid being left behind, as we continue to live in an age of omnipresent technology and digital communications.


What will I learn?

A computing course will typically cover the most basic functions of a computer, as well as explaining a little about how it actually works, such as the difference between hardware (the physical and electronic components) and software (the programs that enable computers to perform specific tasks). Beginners might also be advised to consider a typing course or a word processing course, as keyboards are essential elements of modern computer use, but their layout is hardly intuitive.

Once you have completed your course, you should be able to use peripherals like a printer and possibly a scanner. You will understand a little more about how computers function, and you will be able to use a keyboard and mouse to open and close programs, access the internet and probably send an email. However, one of the most fascinating aspects of computing is the endless scope to learn new skills and advance your knowledge further with courses in related areas such as desktop publishing or multimedia computing.


How did it all start?

The history of computers is muddied by disagreement about what qualifies under this catch-all term. Some historians argue that Charles Babbage’s Difference Engine, designed in the 19th century, was the genesis of computers, even though it was basically a mechanical calculator. However, a more commonly agreed starting point is the first fully programmable computer, work on which began in 1936, while IBM’s entry into the computer market in 1953 (coupled with the earlier invention of the transistor) paved the way for FORTRAN, the first widely used programming language, whose development began in 1954.

The first recognised computer game debuted in 1962, with a crude forerunner of today's ubiquitous mouse arriving two years later. The 1970s saw microprocessors, floppy disc drives and Ethernet arrive, with the first domestic computers available towards the end of the decade. By the early 1980s, Microsoft had unveiled MS-DOS (the precursor of today's iconic Windows), and Apple's Macintosh arrived on the scene in 1984 with rival hardware and an operating system of its own. Finally, and perhaps most significantly, the World Wide Web followed in the early 1990s, built on the Internet (itself descended from an American military research network), and promptly ushered the world into the communications age.


Fascinating facts about the rise of computers

· In 1943, Thomas Watson, the head of IBM, is alleged to have said: “I think there is a world market for maybe five computers”. Today, with many first-world homes containing several desktops, laptops, smartphones and tablets, it is impossible to say accurately how many computers are in existence.

· Modern computer programs are typically pre-installed or downloaded, but older hardware required programs to be loaded from an external storage source. This involved a variety of portable media ranging from CDs and cassettes through to lengthy hole-punched rolls of paper tape, with each hole representing a single binary digit of information.

· Despite their incredible complexity, computers are entirely binary: everything they do is governed by a series of ones and zeroes, as the short sketch below illustrates. An astonishing number of calculations take place every second to allow people to play games or carry out work, and computers are constantly becoming faster and more powerful.
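
To make that idea concrete, here is a minimal Python sketch (purely illustrative, and not drawn from any particular course syllabus) showing how an ordinary number and a short piece of text can be written out as the ones and zeroes a computer actually stores:

    # Purely illustrative: how familiar values look as binary digits (bits)
    number = 42
    text = "Hi"

    # Whole numbers are stored as patterns of ones and zeroes
    print(format(number, "08b"))        # prints 00101010

    # Text is stored as numbers too: each character has a numeric code,
    # and that code is itself held in binary
    for character in text:
        code = ord(character)           # e.g. 'H' has the code 72
        print(character, code, format(code, "08b"))

Running this prints the binary form of the number 42 and of each letter in the word "Hi" – a tiny glimpse of the kind of work a modern machine performs billions of times every second.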


By Neil Cumins

Search computing courses now
