“C” as Part of a Mechanical Engineering Curriculum

Most engineering programs expect undergraduates to take computer programming, but requirements vary widely. My institution, the University of California, Davis, requires electrical engineering students to take four programming courses, but mechanical engineering students take just one course. Which language should students learn? As more mechanical devices add electronic controls, the choice of language becomes more critical.

In 1998 Matlab replaced the more traditional FORTRAN as one of the four required courses for UC Davis electrical engineers and in the mechanical engineering curriculum as well. But after four years, our students’ programming skills had declined compared with those who had taken FORTRAN. In one project, the design of a robot for gathering samples on Mars, only students proficient in the C programming language could program the specified Atmel 8-bit microcontroller. After noting that Matlab alone was insufficient for serious programming, we redesigned our curriculum in 2003 to combine C with an introduction to Matlab.

Why C?

An introductory programming course should use a non-proprietary programming language that adheres to an international standard. A standardized language is stable, and its evolution is supported by industry and overseen by technical standards committees and other stakeholders. As a language, C continues to evolve but remains backward compatible: a compiler conforming to the C99 standard still accepts programs written to the earlier C89 standard. Matlab, by contrast, is a proprietary mathematical programming language, which makes it difficult to collaborate with individuals not running Matlab. C has arguably become the most common programming language, both in engineering and elsewhere. More than 90 percent of desktop computer programs, from operating systems to word processors, are written in C or its relative, C++. C runs on all platforms, and most other languages can be translated into C. On the Programming Language Popularity website, C tops the list, while C++ is fourth; FORTRAN is No. 21, and Matlab is nowhere to be seen.

C is especially useful for mechanical engineers because it is the language of choice for hardware interfaces, and commonly used for data acquisition and real-time robotic control. C is also the most widely used language for programming embedded processors: Of the 9 billion microprocessors manufactured in 2005, 8.8 billion were embedded. Although the learning curve is somewhat steep, students of C gain valuable knowledge of data types, compiling, linking, and optimization, and receive a solid foundation for acquiring advanced programming skills. Once students know C, they can learn other languages more easily, particularly those that borrow heavily from C. Users can either compile or interpret a C program. C interpreters let students execute a single line without compilation, thus providing immediate feedback. Some C interpreters also contain graphical plotting and advanced numerical computing capabilities typically found in mathematical programming languages.

Teaching C in Context

Just as learning foreign languages helps students understand their native tongue, learning C with other languages sheds light on the fundamentals of computer programming. For example FORTRAN, which dates back to the 1950s, remains one of the primary professional programming languages, especially for such computationally intensive programs as computational fluid dynamics. FORTRAN is therefore one of the best candidates for mechanical engineering students to compare with C. C99, ratified in 1999, includes features that enable it to be optimized as efficiently as the equivalent FORTRAN programs. C99 also supports complex numbers and variable length arrays that are useful in engineering and science.
An introductory programming course should focus on problem-solving. Our course, which runs for one academic quarter (10 weeks), must cover a lot of ground. Due to time constraints, we teach students both C and Matlab, and provide handouts on FORTRAN as a second programming language. Their solid foundation in C helps our students learn Matlab quickly. We demonstrate the strength and some unique features of Matlab by having students use it to re-solve many of the same problems that they tackled earlier while learning C.

With a solid foundation in C, mechanical engineering students are well prepared for today’s projects, which increasingly integrate mechanical hardware with control software. They also acquire the foundation to learn more advanced mathematical programming languages, and to take advantage of new and emerging computing paradigms.

Adapted from “C for the Course” by Harry H. Cheng, University of California, Davis, Mechanical Engineering, September 2009

Best Programming Language for Robotics

In this post, we’ll look at the top 10 most popular programming languages used in robotics. We’ll discuss their strengths and weaknesses, as well as reasons for and against using them.

It is actually a very reasonable question. After all, what’s the point of investing a lot of time and effort in learning a new programming language, if it turns out you’re never going to use it? If you are a new roboticist, you want to learn the programming languages which are actually going to be useful for your career.

Why “It Depends” is a Useless Answer

Unfortunately, you will never get a simple answer if you ask “What’s the best programming language for robotics?” to a whole roomful of robotics professionals (or on forums like Stack Overflow, Quora, Trossen, Reddit, or ResearchGate). Electronic engineers will give a different answer from industrial robotic technicians. Computer vision programmers will give a different answer than cognitive roboticists. And everyone will disagree about what “the best programming language” is. In the end, the answer most people would agree with is “it depends.” Even though that is the most realistic answer, because it does depend on what type of application you want to develop and what system you are using, it is a pretty useless answer for the new roboticist trying to decide which language to learn first.

Which Programming Language Should I Learn First?

It’s probably better to ask, which programming language is the one you should start learning first? You will still get differing opinions, but a lot of roboticists can agree on the key languages. The most important thing for roboticists is to develop “The Programming Mindset” rather than to be proficient in one specific language. In many ways, it doesn’t really matter which programming language you learn first. Each language that you learn develops your proficiency with the programming mindset and makes it easier to learn any new language whenever it’s required.

Top 10 Popular Programming Languages in Robotics

There are over 1500 programming languages in the world, which is far too many to learn. Here are the ten most popular programming languages in robotics at the moment. If your favorite language isn’t on the list, please tell everyone about it in the comments! Each language has different advantages for robotics. I have ordered them only roughly by importance, from least to most valuable.

10. BASIC / Pascal

BASIC and Pascal were two of the first programming languages that I ever learned. However, that’s not why I’ve included them here. They are the basis for several of the industrial robot languages, described below. BASIC was designed for beginners (it stands for Beginners All-Purpose Symbolic Instruction Code), which makes it a pretty simple language to start with. Pascal was designed to encourage good programming practices and also introduces constructs like pointers, which makes it a good “stepping stone” from BASIC to a more involved language. These days, both languages are a bit outdated to be good for “everyday use”. However, it can be useful to learn them if you’re going to be doing a lot of low level coding or you want to become familiar with other industrial robot languages.

9. Industrial Robot Languages

Almost every robot manufacturer has developed their own proprietary robot programming language, which has been one of the problems in industrial robotics. You can become familiar with several of them by learning Pascal. However, you are still going to have to learn a new language every time you start using a new robot. ABB has its RAPID programming language. Kuka has KRL (Kuka Robot Language). Comau uses PDL2, Yaskawa uses INFORM and Kawasaki uses AS. Then, Fanuc robots use Karel, Stäubli robots use VAL3 and Universal Robots use UR Script. In recent years, programming options like ROS Industrial have started to provide more standardized options for programmers. However, if you are a technician, you are still more likely to have to use the manufacturer’s language.

8. LISP

LISP is the world’s second oldest programming language (FORTRAN is older, but only by one year). It is not as widely used as many of the other programming languages on this list; however, it is still quite important within Artificial Intelligence programming. Parts of ROS are written in LISP, although you don’t need to know it to use ROS.

7. Hardware Description Languages (HDLs)

Hardware Description Languages are basically a programming way of describing electronics. These languages are quite familiar to some roboticists, because they are used to program Field Programmable Gate Arrays (FPGAs). FPGAs allow you to develop electronic hardware without having to actually produce a silicon chip, which makes them a quicker and easier option for some development. If you don’t prototype electronics, you may never use HDLs. Even so, it is important to know that they exist, as they are quite different from other programming languages. For one thing, all operations are carried out in parallel, rather than sequentially as with processor-based languages.

6. Assembly

Assembly allows you to program at “the level of ones and zeros”. This is programming at the lowest level (more or less). In the recent past, most low level electronics required programming in Assembly. With the rise of Arduino and other such microcontrollers, you can now program easily at this level using C/C++, which means that Assembly is probably going to become less necessary for most roboticists.

5. MATLAB

MATLAB, and its open-source relatives such as Octave, are very popular with some robotic engineers for analyzing data and developing control systems. There is also a very popular Robotics Toolbox for MATLAB. I know people who have developed entire robotics systems using MATLAB alone. If you want to analyze data, produce advanced graphs, or implement control systems, you will probably want to learn MATLAB.

4. C#/.NET

C# is a proprietary programming language provided by Microsoft. I include C#/.NET here largely because of the Microsoft Robotics Developer Studio, which uses it as its primary language. If you are going to use this system, you’re probably going to have to use C#. However, learning C/C++ first might be a good option for long-term development of your coding skills.

3. Java

As an electronics engineer, I am always surprised that some computer science degrees teach Java to students as their first programming language. Java “hides” the underlying memory functionality from the programmer, which makes it easier to program than, say, C, but it also means that you have less of an understanding of what the machine is actually doing with your code. If you come to robotics from a computer science background (and many people do, especially in research), you will probably already have learned Java. Like C# and MATLAB, Java is an interpreted language, which means that it is not compiled into machine code. Rather, the Java Virtual Machine interprets the instructions at runtime. The appeal of Java is that, thanks to the Java Virtual Machine, you can use the same code on many different machines. In practice, this doesn’t always work out and can sometimes cause code to run slowly. However, Java is quite popular in some parts of robotics, so you might need it.

2. Python

There has been a huge resurgence of Python in recent years, especially in robotics. One reason for this is probably that Python and C++ are the two main programming languages found in ROS. Like Java, it is an interpreted language. Unlike Java, the prime focus of the language is ease of use, and many people agree that it achieves this very well. Python dispenses with a lot of the usual things that take up time in programming, such as defining and casting variable types. There are also a huge number of free libraries for it, which means you don’t have to “reinvent the wheel” when you need to implement some basic functionality. And since it allows simple bindings with C/C++ code, performance-heavy parts of a program can be implemented in those languages to avoid performance loss. As more electronics start to support Python “out of the box” (as with the Raspberry Pi), we are likely to see a lot more Python in robotics.

1. C/C++

Finally, we reach the number one programming language in robotics! Many people agree that C and C++ are a good starting point for new roboticists. Why? Because a lot of hardware libraries use these languages. They allow interaction with low-level hardware, allow for real-time performance, and are very mature programming languages. These days, you’ll probably use C++ more than C, because C++ is basically an extension of C with much more functionality. It can be useful to learn at least a little bit of C first, so that you can recognize it when you find a hardware library written in C. C/C++ are not as simple to use as, say, Python or MATLAB: it can take quite a lot longer to implement the same functionality in C, and it will require many more lines of code. However, as robotics is very dependent on real-time performance, C and C++ are probably the closest things that we roboticists have to “a standard language.”

Source: Alex Owen-Hill, blog.robotiq.com


Google’s RankBrain Algorithm

RankBrain is a machine-learning artificial intelligence system, the use of which was confirmed by Google on 26 October 2015.[1] It helps Google to process search results and provide more relevant search results for users.[2] In a 2015 interview, Google commented that RankBrain was the third most important factor in the ranking algorithm, along with links and content.[2] As of 2015, “RankBrain was used for less than 15% of queries.”[3] Tests showed that RankBrain’s choices were well within 10% of those made by Google’s team of search engineers.[4]

If RankBrain sees a word or phrase it isn’t familiar with, the machine can make a guess as to what words or phrases might have a similar meaning and filter the results accordingly, making it more effective at handling never-before-seen search queries or keywords.[5] Search queries are sorted into word vectors, also known as “distributed representations,” which are close to each other in terms of linguistic similarity. RankBrain attempts to map a query onto words (entities) or clusters of words that have the best chance of matching it. In other words, RankBrain guesses what people mean and records the results, adapting future results to provide better user satisfaction.[6]

There are over 200 different ranking factors[7] that make up the ranking algorithm, whose exact functions in the Google algorithm are not fully disclosed. Behind content and links,[8] RankBrain is considered the third most important signal in determining ranking on Google search,[9][3] although Google has not admitted to any order of importance, only that RankBrain is one of the three most important of its search ranking signals.[10] When offline, RankBrain is given batches of past searches and learns by matching search results. Studies showed how RankBrain better interpreted the relationships between words. This can include the use of stop words in a search query (“the,” “and,” “without,” etc.), words that were historically ignored by Google but are sometimes of major importance to fully understanding the meaning or intent behind a person’s search query. It is also able to parse patterns between searches that are seemingly unconnected, to understand how those searches are similar to each other.[11] Once RankBrain’s results are verified by Google’s team, the system is updated and goes live again.[12] In August 2013, Google published a post about how it uses AI to learn searcher intention.[13]

Google has stated that it uses tensor processing unit (TPU) ASICs for processing RankBrain requests.[14]

Impact on digital marketing

RankBrain has allowed Google to speed up the algorithmic testing it does for keyword categories, in an attempt to choose the best content for any particular keyword search. This means that old methods of gaming the rankings with false signals are becoming less and less effective, and the highest-quality content from a human perspective is being ranked higher in Google.[15]

  1. https://searchengineland.com/library/google/google-rankbrain
  2. Clark, Jack. “Google Turning Its Lucrative Web Search Over to AI Machines”. Bloomberg Business. Bloomberg. Retrieved 28 October 2015.
  3. “Google uses RankBrain for every search, impacts rankings of “lots” of them”. Search Engine Land. 2016-06-23. Retrieved 2017-04-14.
  4. “Google RankBrain 權威指南 | Whoops SEO”. seo.whoops.com.tw (in Chinese). Retrieved 2018-01-15.
  5. “Google Turning Its Lucrative Web Search Over to AI Machines”. Surgo Group News. Retrieved 5 November 2015.
  6. Capala, Matthew (2016-09-02). “Machine learning just got more human with Google’s RankBrain”. The Next Web. Retrieved 2017-01-19.
  7. “Google’s 200 Ranking Factors: The Complete List”. Backlinko (Brian Dean). 2013-04-18. Retrieved 2016-04-12.
  8. “Rankbrain 2017”. Pay-Website (Edith). 2017-05-12. Retrieved 2017-08-21.
  9. “Now we know: Here are Google’s top 3 search ranking factors”. Search Engine Land. 2016-03-24. Retrieved 2017-04-14.
  10. “Google Releases the Top 3 Ranking Factors | SEJ”. Search Engine Journal. 2016-03-25. Retrieved 2017-04-14.
  11. “The real impact of Google’s RankBrain on search traffic”. The Next Web. Retrieved 2017-05-22.
  12. Sullivan, Danny. “FAQ: All About The New Google RankBrain Algorithm”. Search Engine Land. Retrieved 28 October 2015.
  13. “Google RankBrain 權威指南 | Whoops SEO”. seo.whoops.com.tw (in Chinese). Retrieved 2018-04-26.
  14. “Google’s Tensor Processing Unit could advance Moore’s Law 7 years into the future”. PCWorld. Retrieved 2017-01-19.
  15. “NonTechie RankBrain Guide [Infographic]”. http://www.logicbasedmarketing.com. Retrieved 2018-02-16.

AI in CNC machining

Automation and artificial intelligence have found their way into almost all industries. When combined with computer numerical control (CNC) machining, AI can help remove the manual labor in redundant tasks. Generally, the software’s algorithm can be designed so that, after receiving feedback in a specific situation, a decision is actualized with or without human consultation. For repetitive tasks requiring no consultation, the software can execute the required steps, eliminating manual labor.

For example, with precision CNC machining, you can design a program to shut off a car if it is left unattended for a few minutes. If you leave a car running in the parking lot or your garage, embedded AI code might send you a message to alert you that the car is on. If there is no response on the owner’s part (your part), the algorithm will shut the engine off after 8 minutes.

Such algorithms can be used in other industries as well, for example in smartphones that provide situational alerts, or in devices that help you understand the dangers surrounding you.

So, now that you know about the different industries that can use artificial intelligence, let’s look at how AI can help you speed up your work. Here are some things artificial intelligence can do better than human beings.

Search the Internet

Well, most of us have heard about Google’s RankBrain algorithm. Did you know that it was developed based on artificial intelligence? It is a machine-learning-based artificial intelligence that handles search queries for Google. Because RankBrain understands words and phrases, it can predict the top-ranking pages more easily than its human counterparts. Although it is still being tweaked, the base algorithm of Google’s RankBrain remains unchanged.

Work In Inhumane Conditions

Well, robots don’t have feelings. This is why they can survive in places without oxygen, where no human being can survive, and why artificial intelligence is essential for surveillance in deep oceanic trenches, radioactive locations, and even outer space. The only problem with AI in CNC machining is that it waits for human input on certain crucial decisions. This makes the process not only time-consuming but also, to some extent, useless when the AI needs to function independently.

If you own an iPhone, or a Windows 10 phone or PC, you will have come across Siri or Cortana. Both are forms of artificial intelligence that help you achieve what you need. In fact, if you use Google’s voice assistant on any of your Android devices, you will see how Google responds to your commands, and even opens your task list and checks off tasks as you direct. While this might seem freaky and surreal, with a proper set of coded instructions any artificial intelligence can achieve the result you’re looking for. All it needs is a decision-making loop.

Artificial Intelligence in Medical Science

One of the biggest achievements of artificial intelligence is perhaps its progress in the medical field. Even if a physician has plenty of exposure to patients, proper and accurate diagnosis can be a problem. With artificial intelligence in place, however, the process not only becomes smoother but also more accurate.

On average, a physician spends 160 hours per month keeping track of the latest medical breakthroughs. Remembering those breakthroughs along with a patient’s latest symptoms, and applying them in regular diagnosis, can become problematic for the human brain. By comparison, IBM Watson can produce a proper diagnosis within a fraction of a second. Additionally, the AI’s accuracy rate for diagnosing lung cancer is 90%, which is quite high compared with the accuracy rate of veteran human physicians (50%).

At the end of the day, even though AI has found its way into different industries, it cannot totally replace human intelligence because of its lack of general reasoning. While a robot or AI is good at doing each of the tasks outlined above individually, the same might not be true when it comes to completing a different set of tasks that it has not been programmed for. This makes it tough for AI to replace human intelligence, unless more research is done.

Forget the Robot Singularity Apocalypse.

FOR A SPECIES that’s conquered Earth and travelled through space and invented the Slapchop, we humans sure are insecure when it comes to technology. Our greatest fear: the singularity, when the abilities of AI and robots surpass those of humans, growing so advanced that civilization is forced to reboot as humanity spirals into existential dread. Or worse, the machines turn us into batteries, à la The Matrix.

As fun as that all sounds, UC Berkeley roboticist Ken Goldberg thinks the singularity is bunk. “I think it’s counterproductive,” he says. “I think it’s demoralizing and it’s fiction. We’re not even close to this.”

The robot revolution we are in the midst of is actually way more interesting. Goldberg calls it the multiplicity. “Multiplicity is not science fiction,” he says. “It’s something that’s happening right now, and it’s the idea of humans and machines working together.” So welcome to the future, where robots do things like gently hand us screwdrivers instead of stabbing us with them.

You, my friend, are already part of the multiplicity. When you jump in your car and boot up Google Maps (or Apple Maps, if you’re a glutton for punishment) and let algorithms guide you to your destination, you’re collaborating with a machine. You may even have a car that drives for you on the highway—in which case, you’re not just collaborating with a machine, you’re entrusting it with your life.

As the machines grow more sophisticated, so too will our interactions with them. Truly self-driving cars that you can buy and have shuttle you around are probably decades away, but in the meantime, you’ll likely drive a car that does a portion of the driving for you. What you don’t want it to do, though, is suddenly freak out when it’s not confident it can handle a situation and start flashing alerts at you. You want it to communicate consistently.

“All along it’s sort of saying, ‘Hey, you know, it’s starting to get a little overcast, I’m getting a little uncomfortable. Can you sort of tune in?’” says Goldberg. “So it’s kind of keeping you informed before the crisis.”

The very nature of work, too, is transforming in the age of multiplicity. Recently, robots have escaped the lab and the factory to work alongside humans, thanks largely to sensors like lidar that have become both cheaper and more sophisticated. Security robots, for instance, supplement human guards. Hospital robots deliver drugs and linens to nurses. And at Walmart, a towering robot rolls through the aisles scanning shelves to do inventory.

Very few robots out there are meant to actually replace human labour, and there’s little research to suggest that the jobless future is nigh. “It’s not a desire to simply take people’s jobs away,” says Goldberg. “We want to enhance people, and we want them to be able to focus on the more subtle, rewarding, and human aspects of their jobs.” Benevolent capitalism this is not; it’s the idea that humans can triumph where robots fail, and vice versa. Robots are great at brute strength, precision, and speed. Humans have better brains and marvellous hands with which to grip an array of objects. And these contrasts are going to stay contrasts for a long while to come.

A great example of multiplicity in action is what Amazon is doing. In its fulfilment centers, it employs 100,000 robots that autonomously deliver products to humans, who then pack the boxes that go out to customers. So here the tireless robots do the dull task of hurrying around the warehouse, while the humans handle the complex manipulations that would confound a robot. Hell, the robots don’t even have the hands (or end effectors, in robot lingo) required to grasp things.

“The way we have to start thinking about robots is not as a threat, but as something that we can work within a collaborative way,” says Goldberg. “And so a lot of it is changing our own attitudes.”

Which may be a big ask, given the pervasive narrative of the job-killing robot. San Francisco is going so far as to consider a tax on robots that replace humans. Not helping matters are flashy robots, like Boston Dynamics’ Atlas humanoid, that appear to be more advanced than they actually are. Atlas can do backflips now, true enough, but robots, and bipeds in particular, are still primitive. For proof, watch them fall on their faces pretty much constantly at the DARPA Robotics Challenge.

Sure, as robots get more sophisticated, they’ll threaten to replace some human labour outright. That’s the nature of automation. But we won’t wake up tomorrow amid a robot singularity. The shift will happen more gradually, and even then, not catastrophically. All the while, and probably far, far into the future, we’ll be collaborating with the machines, exploiting their strengths while celebrating our own, which hopefully doesn’t include our ability to double as batteries. – Matt Simon