Thursday, February 27, 2014

Embedded World - First impressions

Once again it's time for Embedded World in Nuremberg, Germany, the biggest and most important event of its kind in Europe. Here are some of my first findings from the fair.

Internet of Things (IoT) is the big theme of the event this year, and in the embedded industry in general. Wireless connectivity is a very important part of the IoT scheme, and there are some clear changes ongoing. The rising technologies are Wifi, Bluetooth Smart (the new official name of Bluetooth Low Energy), and Sub-GHz RF solutions. Other 2.4 GHz radios, including Zigbee and classic Bluetooth, are losing market share.

ARM, Linux and Qt seem to be the winning combo. In the past, the x86 architecture dominated the embedded market. Now clearly all new designs are based on the ARM architecture. x86 products only exist because they were designed in the past and are still in production, not yet dumped altogether. In an earlier posting, I discussed "Intel - the walking dead". Intel had a big presence at the trade fair, but the only place where I found any mention of the new Quark CPU was Intel's own booth. No one else seems to be interested in it. Very few companies are planning anything new with Atom either.

The same goes for operating systems. Embedded Windows seems to have disappeared from the map altogether. All the demos I familiarized myself with were based on either Linux or QNX, or occasionally on some other RTOS. Not a single Windows case, except perhaps at the Microsoft booth, but I didn't even bother to visit it.

Enea Linux and a distributed database demonstrated with a Raspberry Pi and a BeagleBone Black.
As an example, the Swedish software technology company Enea just recently released an open Enea Linux distribution, based on the Yocto project. Enea is known for its OSE and OSEck operating systems, widely used in network processors and DSPs in communication network elements, and it is the clear market leader in that segment.

On the graphics side, the game is not that clear. Qt, provided by Digia, has a strong presence in the market, but there is still space for other solutions as well, especially in 3D graphics for gaming and demanding visualization. Qt does offer solutions for that too, but it is still best known for its 2D user interface designs. HTML5 is also growing as a challenger for local embedded UI displays as the computational power of embedded solutions increases, so Qt will have challenges keeping its market share. I had a long conversation with Digia people, and there are quite a lot of interesting things coming in the 5.3 and 5.4 releases, not yet available.

I'm happy to see that the choices made by my company a decade ago were the right ones: commit to ARM and Linux technologies. At that time, Qt was not widely available, but already before it was published as open source, we started to co-operate with Trolltech, the company that originally created the technology. ARM, Linux and Qt, here we come!

More findings from Embedded World to come in the next postings, stay tuned.

Saturday, February 15, 2014

Imp - perfect?

Easy add-on IoT and WSN from Electric Imp.

When I first heard of the Imp, it sounded almost too good to be true: an MCU module with Wifi in the small form factor of an SD card, readily available cloud connectivity, and an easy means of configuring the wireless interface. I immediately ordered some samples, and here are my first experiences. There are great hookup instructions on the Sparkfun pages, so I don't need to repeat them here.

The concept is great especially for products where IoT connectivity is intended as an add-on option. The BOM price of the base product can be kept as low as possible: only an SD-card holder and an identification chip from Atmel are needed, which increases costs by ~1€. The SD-card form factor makes it very easy to retrofit the module into the product, even by the end user if necessary.

Imp concept illustration (from the Electric Imp web site).

The Imp module communicates with the Imp cloud via the local Wifi and internet infrastructure. The cloud acts as a proxy between an end device, such as a mobile terminal, and the Imp'd product. When developing a product, one must write software for three places: the Imp module itself, the cloud to define what to do with the data, and the end device to describe how to present the data (an HTML page). Even though the developer can put some code in the cloud, hosting web pages for the end device is not supported, as far as I understand. Thus an external web server is needed, or the web page must be stored locally on the end device.
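To make the split concrete, below is a minimal sketch of the end-device side only: a client polling a reading that the cloud proxy publishes over HTTP. The URL and the JSON payload shape are purely my own assumptions for illustration, not the actual Imp API; the module and cloud parts themselves are written in Squirrel in the Imp IDE.

```python
# Hypothetical end-device client polling the cloud proxy for the latest reading.
# The agent URL and payload format are illustrative assumptions only.
import json
import urllib.request

AGENT_URL = "https://agent.example.com/abc123/latest"  # placeholder, not a real Imp URL

def fetch_latest_reading():
    with urllib.request.urlopen(AGENT_URL, timeout=10) as response:
        return json.loads(response.read().decode("utf-8"))

if __name__ == "__main__":
    print(fetch_latest_reading())  # e.g. {"celsius": 21.4, "timestamp": 1393500000}
```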

The company has invented a clever way to configure the Wifi network with the help of a mobile phone. With an app available for iOS and Android, the user can transmit the Wifi network configuration optically to the Imp module. In practice, the app blinks the screen of the mobile phone, which is then recognized by the optical sensor of the Imp module, a sort of modern Morse code. The mobile phone in the bottom left corner of the picture illustrates that function; it does not mean there is any possibility of direct local connectivity between the phone and the Imp node without the cloud.

Imp module opened, and breakout card holder from Sparkfun.
The hardware specifications of the module are impressive. In the tiny form factor, both an ARM Cortex-M3 MCU and Wifi are embedded. The SoC controller is an STM32F205RG6 from ST Microelectronics with 128 KB RAM and 1024 KB Flash, running at 120 MHz. The Wifi chip is a Broadcom BCM43362 SiP, supporting 2.4 GHz 802.11b/g/n.

Actually, the hardware configuration looks pretty similar to the Murata SN8200. I could imagine they are both derived from the same reference design. The Imp is also available as a solder-down module, which resembles Murata's even more.

The electrical interface of the SD card provides a very limited number of signals. On the other hand, they can be configured freely, which makes the module suitable for many purposes. Among the 6 I/O lines available, one can have up to 3x UART, 2x I2C, 2x SPI, 6x ADC, 6x PWM, and of course 6x GPIO. Each pin can source up to 4 mA of current. For further details, take a look at the Imp pin mux table.

Unlike Murata, the Imp module has closed firmware. The only supported means of programming is the Imp Cloud IDE, with a somewhat unfamiliar scripting language called Squirrel. The open source project defines itself as follows: "Squirrel is a high level imperative, object-oriented programming language, designed to be a light-weight scripting language that fits in the size, memory bandwidth, and real-time requirements of applications like video games." The language itself looks like a mixture of C, C++, Java, Python, Lua, etc. People familiar with those languages should learn it quickly.

To summarize some features:

 Pros:
+ Easy to set up
+ Cloud connectivity provided
+ Easy programming environment and target deployment

Cons:
- Only cloud connectivity provided, no support for local connectivity
- Closed firmware, only cloud IDE and Squirrel language supported
- Limited electrical connectivity
- Specific ID chip from Atmel is required


Due to the cloud-only approach, the module is a risky choice for any commercial use. It's hard to rely on a third-party service over which you have no control at all. A system you provide is only usable as long as the company stays in business and is willing to provide the cloud proxy free of charge.

I do not quite understand why the company has selected this closed approach. Perhaps they are planning to start charging for connections some day, once the critical mass has been exceeded. There is no technical reason for the choice, as the reference design of the module provides extensive software support, including Wifi drivers, a TCP/IP stack, HTTP/DHCP/DNS servers, a DHCP client, and more.

I hope the company will open up the firmware to the community, so that people can bring in the features they need for their projects or products. I'm confident they would get better acceptance and a better ecosystem that way, and eventually more sales. Or perhaps people will manage to create alternative firmware for the Imp by themselves... It's a fine product, and I don't want a bad software strategy to destroy it.

Thursday, February 13, 2014

HIL toolchain integration strategy

How to select an integration strategy for a HIL testbench?

In the context of embedded systems, there are two major domains: software design and electronics design. In both domains there already exists an extensive set of tools and toolchains to meet every purpose within that domain. The integration of these two domains has been covered in my previous postings. Now the question is: how to select which tools to integrate, and how? What is the strategy?

In electronics design and testing, most tools are graphical: schematic editors, layout editors, circuit simulators, test cases and sequencing, etc. In software development, on the other hand, all tools are more or less text based, even inside a graphical IDE: code editors, compilers, debuggers, unit test tools, integration test tools, etc. It's all about defining in textual form what is supposed to happen.

Graphical expressions are often intuitive and easy to understand, and with simple tasks the productivity tends to be good. However, the expressive power of graphical presentations is limited compared to textual expressions. With graphical tools one can only do things for which graphical elements exist. Combination, repetition, abstraction, etc. are far more difficult with graphical presentations.

Graphical tools tend to be vendor specific, and cross-combining graphical presentations from different vendors is next to impossible. I do not mean, for example, schematic editors exchanging schematic diagrams, but rather a test tool automatically understanding your schematic diagrams. In the software development domain, mixing and matching different tools is business as usual and a daily habit.

Because of that, I claim that in a cross-domain HIL toolchain it's better to select most of the tools from the software side, and to use hardware-specific tools only where absolutely necessary, at the interface to the hardware instrumentation. Product-specific test cases, test sequences, simulation models, etc. are all better done with tools from the software domain.

Block diagram of a possible development and testing toolchain.

The abstraction level of the integration also needs to be decided. The alternatives vary from very low-level primitives, like "measure instrument X, channel Y", to very high-level commands, like "perform a sweep test on the power supply under development". Clearly this selection affects the communication between the tools, and the place where the majority of the test case logic is implemented.

My intuition says that the hardware side should contain routines that are generic and not specific to any particular product. All product-dependent test execution is better collected in one place, for ease of maintenance, manageability and understandability.
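To illustrate the difference, here is a minimal Python sketch of the two abstraction levels, with the generic primitive on the instrumentation side and the product-specific command composed from it on the software side. The class and function names, channels and limits are all hypothetical.

```python
# Hypothetical sketch of the two abstraction levels discussed above.

class Instrument:
    """Stand-in for a real instrument driver (LabView, SCPI, etc.)."""
    def measure(self, channel: int) -> float:
        return 12.0  # dummy reading for illustration

# Low-level, generic primitive: belongs on the hardware/instrumentation side.
def measure_voltage(instrument: Instrument, channel: int) -> float:
    return instrument.measure(channel)

# High-level, product-specific command: belongs with the software-side test cases.
def sweep_test_power_supply(instrument: Instrument, channels=(1, 2, 3),
                            limits=(11.5, 12.5)) -> bool:
    low, high = limits
    return all(low <= measure_voltage(instrument, ch) <= high for ch in channels)

if __name__ == "__main__":
    print("sweep test passed:", sweep_test_power_supply(Instrument()))
```

Keeping the generic primitive dumb and the product knowledge in the high-level command is exactly what makes the product-specific logic easy to collect in one place.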

Tuesday, January 28, 2014

HIL in software test automation toolchain

Introducing HIL in software testing.

A typical toolchain for Test-Driven Development (TDD) in a software project may consist of a continuous integration framework, a static code analysis tool, a unit test tool, and possibly an acceptance test tool. With OSS tools, the chain may consist of Jenkins, cpptest, CppUnit, and Robot Framework, as one example among many. That's a good toolchain for pure software development, but in the case of embedded systems it's missing the HIL aspect.

The Robot Framework project defines itself as a generic test automation framework for acceptance testing and acceptance test-driven development (ATDD). Robot has recently gained popularity within my organization. The framework is written in Python, which seems to be a popular language among many of our developers.

Robot is well suited for testing web user interfaces, communication protocols, database operations, and all cases where the behavior of the software under test can be monitored via external interfaces of some sort, without introducing any test instrumentation into the software itself. In acceptance testing, the intention is to test the production software the way it is supposed to be used in real use cases, which is why instrumentation like that used in unit testing is not acceptable.

Robot is keyword-driven, which enables different abstraction levels, as one keyword can be defined in terms of other keywords. In addition to the Standard Test Libraries, there is an extensive set of community-contributed External Test Libraries for many different purposes. When implementing test cases, one can easily combine keywords from different libraries without any actual coding. A graphical Robot IDE (RIDE) is available to make it easy to write and maintain test case sets.

Robot supports a Remote Server concept similar to gdbserver. With the help of a Remote Server, Robot Framework can communicate with test libraries located on a different machine, or written in a different language. It enables distributed testing as well. The remote server uses XML-RPC over HTTP to communicate between the framework and the server.

Robot Framework Remote Server architecture.
Readily available Remote Servers exist for Python, Ruby, Java, .NET, Perl and more. As XML-RPC is well documented, one can easily implement servers for other languages as well. There even exists a lightweight XML-RPC server implementation for Arduino.
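As a rough illustration of how little is needed, here is a minimal remote library sketch using only the Python standard library. It implements the essential remote protocol methods (get_keyword_names and run_keyword); the keyword itself and the return-value fields follow my reading of the remote interface documentation, so treat it as a sketch rather than a reference implementation.

```python
# Minimal Robot Framework remote library sketch on top of the standard library.
from xmlrpc.server import SimpleXMLRPCServer

class BenchLibrary:
    def power_cycle_target(self):
        """Example keyword; a real one would toggle a relay, talk to a PSU, etc."""
        return "target power-cycled"

LIBRARY = BenchLibrary()

def get_keyword_names():
    return [name for name in dir(LIBRARY) if not name.startswith("_")]

def run_keyword(name, args):
    try:
        result = getattr(LIBRARY, name)(*args)
        return {"status": "PASS", "return": result, "output": "", "error": "", "traceback": ""}
    except Exception as exc:
        return {"status": "FAIL", "return": "", "output": "", "error": str(exc), "traceback": ""}

if __name__ == "__main__":
    server = SimpleXMLRPCServer(("0.0.0.0", 8270), allow_none=True)
    server.register_function(get_keyword_names)
    server.register_function(run_keyword)
    server.serve_forever()
```

On the Robot side, such a library is taken into use with the standard Remote library, for example with "Library  Remote  http://bench-pc:8270".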

The Robot Remote Server concept sounds like the solution for integrating software testing with other domains, especially hardware testing, as discussed in my previous posting. National Instruments LabView is possibly the most popular solution for hardware testing. Implementing an XML-RPC server in LabView and integrating it with a test library in LabView makes it possible to call hardware test routines from software test cases.

Disclaimer:
Personally I'm not a big fan of XML. XML-RPC consumes about four times more characters than plain XML, which is itself bloated compared to JSON, for example. XML-RPC is also fixed to being transferred over HTTP, which is not the most efficient method for two-way communication. JSON is not tied to any specific carrier protocol; WebSocket is possibly the most common and obvious choice, and WebSocket by itself is more efficient for back-and-forth data transfer than HTTP. I hope one day someone will introduce a JSON/WebSocket implementation to substitute for XML-RPC.
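To put a rough number on the verbosity, here is a quick comparison using Python's standard library. The keyword name and arguments are made up, and the exact ratio naturally depends on the payload, so this only illustrates the general tendency.

```python
# Encode the same remote call as an XML-RPC request body and as a comparable JSON message.
import json
import xmlrpc.client

xml_body = xmlrpc.client.dumps(("Measure Voltage", [1]), methodname="run_keyword")
json_body = json.dumps({"method": "run_keyword", "params": ["Measure Voltage", [1]]})

print(len(xml_body), "characters as XML-RPC")
print(len(json_body), "characters as JSON")
```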

Thursday, January 23, 2014

Software development with Hardware In the Loop

Hardware In the Loop (HIL) simulation has been used in the development of complex embedded systems. However, it does not need to be limited to complex systems only; the development of simple embedded systems may benefit from the approach as well.

In traditional HIL simulation, the physical system (the "plant") is modeled mathematically (the "plant model"). The system under test (SUT) then communicates with the model instead of the real physical system, which may be expensive, dangerous or hard to reach during development time.

The challenge in HIL simulation is the effort and time (cost) required to create and maintain a model that simulates the real-world physical process with any degree of accuracy. Because of that, HIL has traditionally been limited to the most complex and safety-critical systems only, such as the automotive and aviation domains.

HIL is not only for verification of the complete system; it can be utilized already in the early phases of the hardware/software co-design process. In pure software development, methods like unit testing and integration testing have long been in use. Why not introduce Hardware In the Loop already in the unit test phase of embedded software development? Then the whole plant model does not need to exist, only tiny parts of it, and over time the model becomes more and more complete, piece by piece.

Wikipedia says: "An HIL simulation must include electrical emulation of sensors and actuators. These electrical emulations act as the interface between the plant simulation and the embedded system under test."
HIL Interfacing
In the illustration above, I refer to sensors as Stimulus and to actuators as Response. I prefer those expressions, as we may introduce HIL already when the target hardware does not yet exist. Then we not only simulate the plant model outside the SUT via sensors and actuators, but also simulate the missing hardware subsystems inside the SUT. Software development and HIL testing can begin with a plain CPU/MCU evaluation board, and there is no need to wait for the first prototype of the target hardware to exist.

In order to execute software unit and integration tests on the target hardware or an evaluation board, some sort of communication and software instrumentation is needed in the target to enable executing individual functions upon request and collecting feedback. Then we have three feedback cycles, as illustrated:
  1. Provide a physical stimulus and inspect what actions it causes in the software
  2. Execute a software function and inspect the physical response
  3. Let the software run and inspect what kind of physical response is caused by a physical stimulus
Now we have full control over the whole process, including physics, electronics, and software phenomena. My suggestion is to start with baby steps, by introducing HIL at the earliest possible phase of software development. Then we start gaining cost savings, reduced project risks and improved safety, especially by expanding the test coverage.
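As a concrete starting point, here is a minimal Python sketch of feedback cycle 2: execute a software function, then inspect the physical response. It assumes pyserial; the port names, the target's command console protocol and the instrument's SCPI query are all placeholders for whatever the actual test bench provides.

```python
# Minimal sketch of feedback cycle 2: run one firmware function on request,
# then verify the physical result through a bench instrument.
import serial  # pyserial

target = serial.Serial("/dev/ttyUSB0", 115200, timeout=2)  # SUT debug/command console (placeholder)
meter = serial.Serial("/dev/ttyUSB1", 9600, timeout=2)     # bench multimeter (placeholder)

def test_relay_output_drives_load():
    target.write(b"call relay_set 1\n")            # execute a single instrumented function
    assert target.readline().strip() == b"OK"

    meter.write(b"MEAS:VOLT:DC?\n")                # inspect the physical response
    voltage = float(meter.readline())
    assert 11.0 < voltage < 13.0, f"unexpected relay output: {voltage} V"

if __name__ == "__main__":
    test_relay_output_drives_load()
    print("feedback cycle 2 passed")
```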

HIL is an essential part of embedded systems regression test automation, and now we're talking about the money!

Edit Jan 28th:
Clarification of what I mean by "talking about the money": manual regression testing is awfully expensive, and that is the motivation for automated regression testing.

Tuesday, January 14, 2014

Intel - the walking dead

Referring to Intel's latest Edison release at CES, I mentioned: "Perhaps now they have invented something where they can be good at." After investigating the Quark architecture further, I no longer think so.

The Quark processor is based on the Pentium architecture, as Intel states. This means it is ultimately based on the 32-bit x86 architecture, which dates back roughly 40 years. Why on earth did Intel select Pentium instead of Atom as the starting point for Quark? I can only conclude that Intel has reached a dead end in its effort to decrease the energy consumption of the Atom architecture.

Even if the Atom cores are advertised as energy efficient, the overall energy consumption of the whole CPU environment, including the chipset and other components, is way above the acceptable limit for embedded systems. With Quark, Intel advertises that they have managed to drop the energy consumption down to 1/10 of that of the Atom cores. Such a tenfold reduction in Atom energy consumption is simply not feasible, so something else was needed, and the answer was reverting back to the Pentium architecture.

Well, let's rewind the clock back 20 years. With the Pentium architecture you lose all the processor development Intel has done over the last two decades: no 64-bit registers, no out-of-order execution, no MMX, no SIMD, and more. All this means that you cannot run the latest Microsoft software on the CPU. Time to dig out the floppies and find DOS and Windows 3.1... Linux runs, and that's the only supported OS at the moment.

The concept of integrating the whole CPU environment with connectivity into an SD-card form factor is clever. I can immediately imagine several use cases for the approach: a system with bare-bones functionality implemented with a tiny MCU, and if connectivity and a user interface are needed, they can be provided in the physical form of an SD-card add-on module. You get both: a minimal-cost base system and state-of-the-art Internet of Things support.

The Quark is just Intel's latest effort in its struggle against non-existence in the embedded market. I'm not convinced, and I hope someone will soon introduce a similar concept based on the ARM architecture.

Edit Jan 18th:
Intel confirmed it will cut 5000 jobs, 5% of its global workforce, due to falling computer sales. As Intel fails to enter new market segments while simultaneously losing sales in its traditional markets, the company can't last long. Google is said to be developing its own processor for data center use, which may indicate Intel losing that market as well.

IBM and NVidia have announced a joint effort to create more energy-efficient computing solutions for enterprise and scientific data centers. At the moment, Intel+NVidia combinations rule the top 10 of the Green500 list, but that may change soon if Intel doesn't improve. As Intel has failed to improve the energy efficiency of its embedded CPUs, I'm not optimistic they will manage to do so in supercomputers either.

http://www.reuters.com/article/2014/01/17/us-intel-jobs-idUSBREA0G1I420140117

Guido Stepken also discusses Intel's power efficiency in his posting The irrelevance of Microsoft/Intel vs Linux/ARM.

Edit Feb 11th:
My dreams have come true! Electric Imp has released an ARM-based, Wifi-enabled embedded computer in an SD-card form factor! It's available at Sparkfun and other distributors. More information and instructions at the Sparkfun site: https://learn.sparkfun.com/tutorials/electric-imp-breakout-hookup-guide/all



Thanks to Jan Tångring @ ElektronikTidningen for the hint. Here is the original article in ETN: http://www.etn.se/58507 (in Swedish).

Adafruit has also published an article: http://www.adafruit.com/blog/2012/12/06/new-product-electric-imp/

Change in embedded 3D graphics market

Nvidia released its latest offering in the Tegra family at the CES fair last week. The K1 CPU will be a game changer in consumer electronics. What makes it so special? At the moment there are only two companies with high-end 3D graphics acceleration assets, and those two companies - AMD and NVidia - dominate the PC graphics card market.

What differentiates the two companies from each other is the fact that AMD does not have a reasonable offering in the embedded CPU sector. AMD is trying to enter the embedded market with its x86-architecture G-series SoCs, but it is not quite there yet. AMD provides a DX11 interface, but it is missing the CUDA architecture. The major benefit of the AMD x86 architecture is the fact that it can run a full-featured Windows operating system. But hey, where are the Windows phones and tablets, has anyone seen any? (*)

The K1 offers all the same graphics acceleration interfaces that full-featured PCs and consoles do, including OpenGL 4.4, DX11 and CUDA 6.0. That means PC/console games should be rather straightforward to port to the new CPU platform. Console-level gaming on tablets is just around the corner. And as the K1 is based on the ARM architecture, Android is just a plug-and-play exercise for a device maker.

Currently Qualcomm dominates the high-end tablet and smartphone market, with an almost 30% operating margin. In graphics, Qualcomm does not have such assets, and it is stuck at the OpenGL ES level. Qualcomm's Snapdragon processors do have high-performance graphics acceleration integrated, but as it is a different graphics architecture, which makes porting difficult, I do not expect them to win the high-end gaming market. If you haven't sold your Qualcomm shares yet, perhaps now is a good time to do it.

Intel is a next-to-nonexistent player in the embedded market. I have never really understood the justification for the existence of the Atom processors; they are not good at anything. Now at CES Intel introduced an SD-card-sized computer with a Quark processor. The concept is called Edison. That's interesting, but it is definitely aimed at something other than high-performance graphics: wearable computing, as Intel says. Perhaps now they have invented something where they can be good at.

At the moment, NVidia does not have a proper modem offering. If they ever introduce one, or co-operate with Broadcom, they will have very strong assets for smartphones as well. Broadcom has a very strong position in connectivity chipsets at the moment, and as they recently purchased Renesas Mobile (the former Nokia modem division), they do have reasonable LTE modem technology in hand.

Nvidia is strongly pushing itself into the automotive market. Audi, BMW and Tesla are currently using Tegra technology in their latest models. At CES, Audi demonstrated a virtual instrument cluster running on the K1. The 3D software technology in use comes from Rightware, which provides the Kanzi 3D solution with a runtime rendering engine and a UI development environment.

Rightware has its headquarters right next to ours in Espoo, and I already have a Kanzi 3D demonstration setup on my desk. In my demo, Kanzi is running on a Freescale quad-core i.MX6 CPU with Android. The i.MX6 is a nice chip, but it is limited to OpenGL ES graphics acceleration. That's definitely good enough for 3D visualization in working machines and vehicle environments, but it's not powerful enough for high-end gaming with heavy 3D graphics processing requirements. Actually, we have recently designed a professional vehicle graphical control panel with an i.MX6 CPU that can benefit from 3D visualization technology.



*) The local elementary school, where my daughter is in fifth grade, is investing in new technology for teaching. Recently the school purchased tablets for all the students. Today at home my daughter reported that for a certain classroom assignment, half of the class got iPads and the rest of the class got Surfaces. The lucky ones who got the iPads finished the assignment in half the time compared to the ones with Surfaces.

Some further readings:
Rightware press release: The New Audi TT Instrument Cluster Created with Rightware Kanzi UI Solution
The Motley Fool: NVidias Tegra K1 Completely changes mobile gaming
CNet: NVidia K1 chip sees the open road
The Verge: Intel announces Edison, a computer the size of an SD card