XMOS blogs

Programming multicore: the right way?


Dr Dave Lacey, Technical Director of Software Tools

Last week I attended the NMI (National Microelectronics Institute) event on Multicore Processors and Programming. The event was focused on how the embedded industry will adapt to the new wave of multicore processors.

One thing that struck me at the event was the range of systems that come under the banner "embedded". Everything from 8-bit micro-controllers to large SOCs running a version of Linux or Android are used in embedded systems. In all these areas, multicore is here and being used today.

At the larger end of the scale, you have chips running a full-blown OS, with large memory, dedicated graphics processors, many peripherals and so on.

These are the same kinds of chips that are used in modern smartphones. These generally exploit multicore via a single OS that decides at run-time which processes to run on which core (the symmetric multiprocessing, or SMP, approach). This route seems a natural one that builds on the current paradigm used for these kinds of system.

That is not to say it is without its perils, however. As a good talk from Feabhas' Niall Cooling showed, programmers still need to know something about the system to avoid subtle race conditions in their code. There is also the lingering question of how to ensure real-time constraints in your program.

However, can we only exploit multicore with large, resource-rich systems capable of running a large SMP-enabled OS? The answer is clearly no. We need to exploit multicore in deeply embedded systems where power and price budgets lead to natural resource constraints and a need for a high level of efficiency. The talk I gave at the event was on one way to program multicore for these kinds of systems.


XMOS at EclipseCon 2014


Matt Fyles

Kris Jacobs and I recently attended the worldwide Eclipse developer conference, a four-day event in San Francisco: https://www.eclipsecon.org/na2014/

Eclipse is one of the key foundation platforms for xTIMEcomposer Studio and helps us to deliver industry-leading tools for our multicore microcontrollers.

The conference itself was very well attended, with the key themes being the Internet of Things, Eclipse for embedded platforms and how the Eclipse platform is evolving to support the new features of Java. It was great from an XMOS perspective to meet some of the developers responsible for the base platform we use and discuss plans for the future.

The industry, and open source tools in particular, are still moving slowly to support multicore, particularly in the areas of debug and trace; these are areas that we at XMOS are constantly pushing forward to help our growing user base. Of particular note was the presentation on multicore visualization, which demonstrated how debugging has to change in order to scale to larger systems: https://www.eclipsecon.org/na2014/session/cdt-and-parallella-multicore-debugging-masses.

A personal favourite was the presentation from NASA about how they use Eclipse to develop the PC applications for their Intelligent Robotics Group. Eclipse in space seemed to be very popular with everyone in attendance; hopefully they will have a chance to play with the startKIT I left with them for their robotics research: https://www.eclipsecon.org/na2014/session/nasa-verve-interactive-3d-visualization-within-eclipse

AVB can be more than AV


Andy Gothard

The recent AVnu Alliance face-to-face meeting in Minneapolis was revealing in more ways than one. For a start, the organization, which is the certification and standards definition body for Ethernet AVB, is about to get some major new member companies, including some household names. All will be revealed soon!
Meanwhile, Conformance and Interoperability (C&I) testing is now in full swing at the Alliance’s appointed testing house, the University of New Hampshire InterOperability Laboratory (UNH-IOL). The first series of switches is already certified, and a number of endpoints are going through the rigorous C&I process.
Work is also progressing on vertical market requirements, including automotive.
All of this illustrates the growing traction AVB has in the market. Just as importantly, it is becoming clear that the AVB suite of standards can have an impact that reaches far beyond the audio-visual applications for which it was originally envisioned.
Ethernet is the world’s favourite networking technology, but its traditional implementation is a poor fit for time-sensitive applications. In fields from AV to industrial measurement and control, factory automation and robotics, the practical result has been a diversity of proprietary networking technologies that don’t interoperate and don’t achieve economies of scale.
Standardization and certification can fix those problems:  the current AVB standards have the capabilities to form a big part of the foundational solution for time-sensitive networking (TSN) using Ethernet. Any application that requires synchronization, low latency, redundancy and determinism, while simultaneously accommodating traditional network traffic, will benefit from this next stage of standardization.
As designers seek to inject more intelligence into their embedded designs, it becomes increasingly important that computing power be intimately and predictably connected to the outside world, and to other embedded devices. The whole TSN approach is certain to have a major impact in this new environment:  it turns out that AVB could be about a lot more than ‘just’ AV.

Machine learning: “biggest thing since the invention of the digital computer”


Mike Furnival, XMOS VP S&M

XMOS at Future World Symposium – 29-30th April 2014, Twickenham Stadium

XMOS recently attended the Future World Symposium (FWS) held at Twickenham Stadium in the UK. This two-day event, hosted by the UK’s National Microelectronics Institute (NMI) and held every two years, was attended by some 190+ delegates and brought together some of the leading names and voices in the world of electronic embedded systems and software.
This year’s theme was about discussing and trying to understand what the Internet of Things (IoT) really means for our future world in terms of connectivity, content, capability and security. Imagine today’s 7Bn connected devices growing to over 50Bn by 2020 and what this could mean for our future. As a result the event attracted some excellent speakers from, amongst others, ARM, Cisco, Spotify, Intel, Freescale, NEC and of course XMOS. In addition to the talks and panel discussion, companies were able to display products and services on small stands in Twickenham’s Rose Room.
Simon Knowles, XMOS’ recently appointed CTO, gave a characteristically upbeat and entertaining talk on his favorite topic, ‘Machine Learning’. Simon certainly got the audience excited about the prospect of tomorrow’s machines not just being programmed to perform tasks but also being trained to learn and adapt to their experiences and environment. Products that may benefit from machine learning include visualization systems, advanced toys and of course, all kinds of robotics systems… It was very much a ‘that film about robots ruling the earth’ kind of experience…
Simon summed up his talk by contending that machine learning will require two fundamental advances. The first will be much higher levels of compute that will only be delivered by future generations of parallel processors. The second will be software capability that will almost certainly spell the end of C or C++ as the embedded systems programming language of choice. His closing statement that ‘machine learning will be the biggest business opportunity since the invention of the digital computer’ was an eyebrow-raising and fitting end to a well-delivered speech.

Exploring the possibilities of the xCORE startKIT


David Wilson

Towards the end of last year we gave away several thousand of our startKIT development kits to designers who had expressed interest in exploring the potential of XMOS technology.

We’ve seen a wealth of interesting write-ups and projects from the community. Here are a few that stood out for us. Mark Graybill posted one of the first ‘how to’ guides following the giveaway (‘Parallel Processes on the XMOS startKIT’) on his ‘infinite improbability’ blog. Mark grapples enthusiastically with the possibilities of startKIT, including posting code for his ‘LED blinker’ for others to use. As an early adopter Mark was also really helpful in spotting some inconsistencies in the startKIT documentation: all useful stuff!

Another fantastic contributor who helped us with suggestions on the documentation and support was 'iraqigeek', who not only blogged about his early experiences, but also contributed massively via the XCore Exchange Community.

At about the same time, following the Christmas ‘lull’, Andrew Back published his own guide on Designspark; ‘Getting started with startKIT’. Andrew referred to xCORE as ‘an incredibly flexible microcontroller’. He describes the key features of the various components and development tools, as well as providing example code and video. This blog is well worth a look for anyone getting started with startKIT.

In amongst a number of practical applications posted on the forums, Jason Lopez (@atomsoft) released a couple of very interesting projects on his own blog across January and February. These related to Jason’s attempts to build a cost-effective home automation system, and to run a TFT (thin film transistor) screen on his startKIT. See his Home Automation Test 1 and 2.8inch TFT blog posts.

More recently (May 2014) forum-user shabaz posted a very comprehensive guide to the XMOS startKIT in element14’s ‘Internet of Things’ community group. This guide not only covered the terminology and architecture behind xCORE technology, but also gave a quick yet detailed guide to getting started, and a few example programs. Shabaz found XMOS technology ‘fantastic at handling timed events and multitasking’, noted that ‘the development environment is fairly straightforward to use’, and concluded that ‘if you want to handle input/output at a relatively high speed with very high accuracy and ease-of-use, XMOS devices are very high up on the list of suitable devices’.

You can see more interesting startKIT projects by visiting the XCore Exchange Community – and there are quite a few more in the works right now, which we’ll mention in due time on our Facebook and Twitter feeds. If you’re doing something interesting with your free startKIT, or have a proposal for an interesting project you’d like to produce using XMOS technology, why not get in touch with us via the same channels? Let’s see what’s possible...

Using the xCORE Analog sliceKIT to power a real time 6DoF platform


In April we ran a competition for members of our XCore Community, with the prize being a full xCORE Analog sliceKIT. We asked entrants to share a 10-word project idea, for which xCORE technology may be of value.

Having announced the competition winner, Thanos Kontogiannis, we caught up with him to talk about his 6DOF motion platform and find out how he plans to use xCORE technology to add another dimension.

Q: What is the 6DOF motion platform and how long have you been working on it?

The 6DOF motion platform is basically a parallel manipulation robot actuator, which is used to manipulate the position of its top plate to various angles, distances or rotations – this is how you achieve the six degrees of freedom effect. I’ve been working on my 6DOF controller since 2008 and completed the project in early 2013. You can see a video of the platform in action here.

Q: How will xCORE technology allow you to further develop the platform?

To position the six actuators correctly, complex mathematics is needed to calculate the inverse kinematics required for correct motion cues. The math model runs in real time on a computer and sends the calculated positions to the motion controller, which processes the Proportional-Integral-Derivative (PID) math required for motor positioning.

Using xCORE technology, I can embed the complicated math calculations inside the motion controllers to make the 6DoF platform respond in real time and perform tasks more autonomously. By dedicating two cores to each motor, the heavy calculations required for more seamless synchronisation and high position feedback accuracy can be done without the computer.

Currently, due to the involvement of the computer, the transmission delays are around 1ms. Furthermore, using the LAN interface, I can transmit the raw six degrees of freedom to the platform in data packets. Removing the computer dependency for the calculation process broadens the variety of applications that the platform can be used for.
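As an illustration of how such a controller might be laid out on an xCORE device, here is a minimal sketch in XC: one logical core per actuator running its own PID loop, with target positions distributed over channels from a kinematics task. The task names, channel protocol, 1kHz loop rate and gains are all illustrative assumptions, not Thanos’ actual implementation.

    // Illustrative sketch only: one logical core per actuator, each running
    // its own PID loop; names, gains and rates are hypothetical.
    #include <platform.h>
    #include <xs1.h>
    #include <timer.h>

    #define NUM_ACTUATORS 6
    #define TIMER_HZ      100000000            // 100 MHz reference clock
    #define LOOP_TICKS    (TIMER_HZ / 1000)    // 1 kHz control loop (assumed)
    #define KP 12                              // illustrative gains only
    #define KI 1
    #define KD 4

    // One control task per actuator: waits for either a new target position
    // or the next control-loop tick, then runs a single PID step.
    void motor_control(chanend c_target) {
      timer tmr;
      unsigned t;
      int target = 0, position = 0, integral = 0, prev_error = 0, error, output;
      tmr :> t;
      while (1) {
        select {
          case c_target :> target:               // new set-point from kinematics
            break;
          case tmr when timerafter(t) :> void:   // time for the next PID iteration
            t += LOOP_TICKS;
            error = target - position;
            integral += error;
            output = KP * error + KI * integral + KD * (error - prev_error);
            prev_error = error;
            // drive_pwm(output);         // PWM/port output omitted in this sketch
            // position = read_encoder(); // feedback path omitted in this sketch
            break;
        }
      }
    }

    // Stand-in for the inverse-kinematics task: the real system would compute
    // the six actuator targets; here it just ramps them slowly.
    void kinematics_task(chanend c[NUM_ACTUATORS]) {
      int pos = 0;
      while (1) {
        for (int i = 0; i < NUM_ACTUATORS; i++)
          c[i] <: pos;
        pos++;
        delay_milliseconds(10);
      }
    }

    int main(void) {
      chan c[NUM_ACTUATORS];
      par {
        kinematics_task(c);
        par (int i = 0; i < NUM_ACTUATORS; i++) {
          motor_control(c[i]);
        }
      }
      return 0;
    }

Because each control loop lives on its own logical core, one actuator’s calculations never delay another’s – which is what makes the kind of tight, PC-free synchronisation described above feasible.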

Q: What other applications could an xCORE-powered platform be used for?

Traditionally, 6DOF platforms have been used for flight simulation and, more recently, driving simulation technology. However, breaking away from the need to use a computer to conduct the necessary calculations opens the door for other applications, such as the real-time first-person-view flying of an unmanned aerial vehicle (UAV) with motion feedback telemetry. Or alternatively, watching a live car race with a cockpit view that uses a live video feed and real-time telemetry to reproduce the conditions in the car for a viewer on a 6DOF platform. This, combined with a virtual reality headset such as the Oculus Rift, could be the ultimate ‘as close to reality’ experience.

Moving away from motion simulators, xCORE technology can be used in many time-critical robotics applications. For example, handling sensors that could enable an autonomous vehicle to drive around using GPS, a real time camera and object processing. The multicore system allows for fail-safety, using other cores as backup or for triage, minimising the possibility of errors. Perhaps that could be my next project…

*****

You can follow Thanos’ progress on his personal blog.

Childhood dreams not far from reality. What I learnt at FWS

 

Last month I attended the Future World Symposium. This event is a two-day foray into the near future, looking at products and projects that we may soon be interacting with on a daily basis, and covering such topics as the Internet of Things (IoT) and machine intelligence.

Here in the UK there has been a great deal of interest in the IoT, but my main takeaway from the FWS event was not the expected proliferation of connected devices throughout the world, which, although impressive (50bn expected by the year 2020), will largely be just a data and efficiency boon for various industries and yield little for the consumer.

What seems far more interesting, for me, is how these devices will work intelligently.

Many of us have used a voice recognition system like Apple’s Siri, and while this does indeed exhibit a reasonable amount of ‘intelligence’, I was surprised to learn that it takes 300 server-class CPUs to handle each request you make. Block Siri from the cloud, and it shows just how little it can do on its own.

For IoT to succeed and show real value to a consumer, it cannot just be a data gathering tool or rely on always-on Internet connections in order to function. Instead, the intelligence needs to be embedded into each IoT device. This will span a range of levels of sophistication: the intelligence required in a toaster will obviously be very different to that of an unmanned car (Talky Toaster aside).

With this capability, a product can start to regulate itself, make itself more efficient, learn your habits and work with/around them (depending on application), clarify requests made of it or make alternative suggestions.

From what I learned at FWS, this sort of interaction with machines is not that far off. Siri shows that speech recognition is well advanced already. The main barrier appears to be making the intelligent technology small enough to be embedded in devices rather than relying on the cloud.

So it looks like we won't be getting Hover Boards any time soon, but perhaps we will all be driving intelligent cars like K.I.T.T. sooner rather than later.

If you would like to learn more about the topics covered at the Future World Symposium, the event presentations can be viewed here.

Exciting Times! - Synapticon and XMOS


XMOS and Synapticon have been working together now for almost three years. During that time, I’ve been impressed with the level of innovation that Synapticon have brought to the field of cyber-physical systems – advanced motion control combined with a revolutionary way of composing designs virtually using the OBLAC system. At the same time, they’ve used the capabilities of the xCORE architecture to deliver class-leading performance and reduce both the cost and time-to-market for these systems. So naturally, when we began to discuss deepening our collaboration, I was very excited at the prospect of helping to bring these two innovative companies together.

In the future imagined by initiatives such as Industry 4.0, production systems will be much “smarter” – distributed, flexible, intelligent nodes, configured by software tools designed to optimize and control the manufacturing process autonomously. Such a bold view requires a seismic shift in the underlying compute and communications architectures coupled with a new programming or training paradigm for these production cells. This view of the future is one of the reasons that the synergies between the two organizations resonate so strongly.

The xCORE architecture combines flexible I/O with true multicore compute capability that can be readily accessed without the need to resort to complex hardware description languages. The SOMANET hardware and software modules exploit both the underlying performance of the xCORE architecture and the I/O capability accessed through simple software calls to create powerful, flexible control and communications nodes, which themselves can be composed using the web-based OBLAC environment enabling the virtual prototyping of cyber-physical systems. This unique combination of technologies offers the intriguing possibility of flexible, low-cost motor and motion control capability packaged to enable rapid time-to-market.

In the same way that XMOS is committed to provide open, programmable multicore hardware platforms, Synapticon is on a mission to revolutionize the delivery of cyber-physical systems. SOMANET provides the infrastructure upon which Synapticon engineers can implement the intelligent nodes mandated by the Industry 4.0 vision, in the same way that xCORE provides the infrastructure underlying SOMANET. One of the most attractive features of our partnership is the passion that Synapticon brings to implementing that vision – and they can bring this passion to your products through their consulting services too!

Perhaps the most exciting aspect of the partnership is that we are already working together on next-generation technologies to further accelerate the adoption of the Industry 4.0 vision. We’re just rolling out the first fruits of our collaboration, but already both companies are working on products which we are sure will be at the heart of smart manufacturing. Together we’re looking forward to revolutionizing this space – exciting times indeed!


Join the growing XCore Exchange Community


XMOS has, since its inception, enjoyed engaging with its following of engineers and developers on a one-to-one level via the XCore Exchange Community.

 
The XCore Exchange Community has been growing at a fantastic rate in 2014 with over 500 new members joining since January!
 
In November we announced the newly launched Q+A section of the site, and this year we’ve been trying hard to keep on delivering improvements, to make the community experience even more rewarding.
 
We’ve had our first competition of the year, giving away a FREE Analog sliceKIT worth almost $200 to TronicGR, who is putting it to good use improving his 6DoF platform, and followed this up with a promotional price offering the Analog sliceKIT to community members for only $110 in June.
 
Moving forward, due to the excellent feedback from both of these activities, we will be following them up with further regular competitions and XMOS giveaways.
 
We are also revamping our Activity Points system to make it easier to reward people for their contributions to the forum. We’re planning a range of rewards, from an XMOS mug to reference design kits!
 
Head on over to www.xcore.com and join the growing community of xCORE developers and engineers.

 

Motion control: from toys to advanced robotics


We recently announced a new family of Motor & Motion control development kits in partnership with Synapticon, and I thought that it would be a good opportunity to consider how these kits will help customers across all segments of the market.
First let’s take a brief tour of the wider Motion Control market and applications.
Motion Control is a broad term which encompasses any machine or system in which movement must be controlled and by implication monitored. This ‘control of movement’ could be as simple as controlling the speed or velocity of a single component (for example the spin speed of a motor or the rate of travel of a conveyer belt), or as complex as controlling and synchronizing all the electrical and mechanical sub-systems and components used in advanced robotics applications (such as Human Assist Robots or autonomous vehicles). With such a broad definition it is no surprise that motion control systems are used in all areas of life, from the factory to the office and the home, and encompass applications as diverse as:
·      Autonomous systems – unmanned aerial vehicles, driverless transport, remote operated vehicles
·      Factory automation – assembly, handling, manufacturing, inspection equipment
·      Robotics – industrial, service, domestic, toys
·      Smart logistics – warehouse automation, unmanned delivery systems
 
Perhaps unsurprisingly, this broad range of products and applications means that there’s an equally diverse range of companies and individuals active in the field. As we talked to potential customers and worked to define our new motion control offerings, we identified at least four distinct categories, each with its own characteristics and requirements.
 
Leaders of the pack

First comes an elite group of industry-recognized domain experts: global players with a long history in motion control, a significant market presence and a rich fund of intellectual property in the field. At XMOS, we coined the phrase ‘Leaders of the pack’ to describe these global players.
 
Working in an organization like this, you need to offer a comprehensive product portfolio and world-class customer support. Your customers look to you as a reliable and trustworthy brand in the market, providing high quality, robust solutions.
 
Our new Motor & Motion Control kit provides an application specific platform that allows domain experts to evaluate the xCORE architecture and the advantages it can bring in motion control applications.
 
Product innovators

The second group we identified are the ‘Product innovators’. If you’re an SME delivering differentiated products to a sub-segment of the market, you are likely to fall into this category. With a strong position in your focus sector, you probably have expertise in one or more core areas: but when necessary you augment that with acquired technology to meet the demands of the marketplace. Innovation is key to success and product differentiation is at the heart of your business planning.
 
For this sector we’ve focused on delivering an easy-to-use hardware and software platform that allows the creation of new classes of unique products in the areas of control and automation. We offer complete kits supported by open source software for creating powerful motion control systems; however, unlike other motion controllers our platform is freely programmable and has significant processing power available to allow developers to add custom functions and applications. Where necessary, our partner Synapticon can help solve the challenges of realizing your vision.
 
System integrators

The third category of customer who we think can benefit from xCORE and SOMANET technology is the ‘System integrator’. The goal here is to deliver standards-compliant products quickly and effectively, across a broad range of applications and sectors. If this is your focus, you need access to proven, robust technologies and platforms which can be leveraged across a number of end products and enable accelerated development cycles.
 
Our new motion control solutions are based on Synapticon’s market proven SOMANET technology and provide access to a flexible motion controller for efficient, standards-compliant and safe motion control applications. Our compact, modular kits and supporting software enable any mechatronic application – from simple sensor data acquisition, through single axis motor control, all the way up to sophisticated multi-axis motion control. Supported by industry standard development tools and programmable in C/C++, the supplied application software can be easily modified and extended if required.
 
Out-of-the-box solutions

Perhaps the most demanding type of customer is our fourth category. An increasing number of agile companies are providing entry-level application-specific systems that require some element of motion control, at competitive cost points. To succeed in this market, you need to focus on operational excellence and deliver system solutions quickly and efficiently. What you require from a motion control technology supplier is a turnkey solution that works “out of the box”. Hence our name for this type of customer: ‘Black boxers’.
 
Black boxers can use the XMOS Motor & Motion Control Kit to gain access to advanced motion control technology without the burden of a lengthy and expensive development cycle. With ready-to-go software for standards-compliant motion control, there is no need to design and implement the low level control systems; we’ve done it for you! Just plug the modules together, select the software from the pre-configured options and you’re ready to go.
 
The aim of our partnership with Synapticon has been to create a motion control solution that serves the needs of all of these diverse companies and markets. There has never been a more exciting time to be involved in motion control. To find out why, head over to our new motion control news hub, where you’ll find news about developments such as smart manufacturing and embedded intelligence, links to some of the best information sources on the web, as well as XMOS/Synapticon product news. Or check out our product pages for more details on the new range of motion control kits.

XMOS hits the road in China


The XMOS team in China are just taking a breath after three busy days at the Shenzhen IPC & Embedded Expo. With one day left to go before the roadshow rolls on to Beijing, the XMOS stand has been a hub of activity, with demonstrations of our Motor and Motion Control solution, AVB, and various USB Audio applications.

We are also providing a demonstration of Sony’s PHA-2 Headphone Amplifier, utilizing XMOS xCORE technology. And you'll be able to see startKIT, the low-cost development kit that makes multicore technology accessible to any embedded developer.

As China’s largest event targeting the IPC and Embedded industries, the IPC and Embedded Expo takes in six days of events across four cities during the month of August. In addition to this week’s Shenzhen event, the Expo visits:

August 12th           - Beijing (China National Convention Centre)

August 15th            - Shanghai (Sheraton Hotel Hongqiao)

August 19th            - Chengdu (Crowne Plaza Chengdu City Center)

The XMOS team will be exhibiting at each location, and will be happy to answer any questions you may have about our xCORE technology and its potential applications.

So why not visit our stand at one of these events and learn more about our range of xCORE devices:

Beijing                     Booth B08


Shanghai                 Booth S09

Chengdu                 Booth C03

Young Engineer XMOS Prize Winner 2014


Following the close of the academic year at Bristol University’s Computer Science faculty – and as we now look forward to a new one – it is time to announce and congratulate the winner of the XMOS Prize 2014.

Rewarding the most promising minds each year since 2008, this annual award goes to the top-performing Computer Science Graduate from the four-year degree course. On behalf of the team here at XMOS, I would like to congratulate recent graduate Gianluca Di Maggio as the recipient of the 2014 prize.


Since spinning out of Bristol University a few years ago, the team at XMOS have not forgotten our roots. XMOS has now evolved into a major semiconductor business, bringing new capabilities to sectors such as robotics, automotive and digital audio. Only last month we announced the completion of a significant new investment round, receiving backing from Bosch, Huawei and Xilinx.

It’s clear that our academic heritage has stood us in good stead, and we want to recognize and reward the skilled engineers who have the potential to shape the world of tomorrow. This award gives us the chance to do just that, because each of us remembers being a budding “engineer of the future” and understands how much reward and encouragement contributed to our own success.

So, our congratulations go to Gianluca. We wish him the very best of luck in the future and a successful career in his chosen field.

To read about last year's XMOS Young Engineer Prize, click here.


Meet the multicore robots


Raymond Kurzweil, the Google artificial intelligence expert and futurologist, has predicted that robots will be able to think like humans by the late 2020s. As we move towards a world where embedded intelligence is a far more realistic prospect, it’s great to see real examples of engineers increasing the autonomy levels of robots. One of the projects that caught our eye is a robot that perhaps can’t think for itself, but is capable of one quite ‘human’ process – climbing stairs.

 The stair-climbing robot was produced by a team of students at Japan’s Hokkaido Polytechnic using xCORE technology for all of its sensing, motion and motor control tasks. Daichu Kimura and his team have used a single xCORE chip to detect outside conditions, positions and angles with 12 sensors and 8 motors in real time.

The complex embedded system is not supported by a traditional real time operating system (RTOS), and doesn’t require multiple microcontrollers or PLDs. Furthermore, the robot has been developed in just 12 months by the team who had no prior knowledge of embedded real time system development. While this is an incredible feat, it does put into perspective just how accessible robotic engineering has become. You can see a video of the Hokkaido robot in action here.

Other robotics-related projects have cropped up and are in progress now. The TwoWheeler project, originally posted by XCore Community user alfanick, aims to develop a self-balancing robot using the XMOS startKIT platform. The robot solves the physical problem of an inverted pendulum, using a PID regulator to connect the motors with an accelerometer. You can download the latest build of this project here.
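For readers curious what that balance calculation involves, the fragment below sketches it in XC: estimate a tilt angle from the accelerometer, then run one PID step on the angle error. It is a generic illustration with assumed names and units, not code taken from the TwoWheeler project.

    #include <math.h>

    // State for one PID regulator; the gains come from tuning the robot.
    typedef struct {
      float kp, ki, kd;
      float integral;
      float prev_error;
    } pid_state_t;

    // Tilt angle (radians) estimated from the accelerometer's Y and Z axes,
    // valid while the robot is close to upright.
    float tilt_from_accel(float ay, float az) {
      return atan2f(ay, az);
    }

    // One PID step: given the angle error and the time step, return a signed
    // drive value to be scaled into a motor command.
    float pid_step(pid_state_t &p, float error, float dt) {
      p.integral += error * dt;
      float derivative = (error - p.prev_error) / dt;
      p.prev_error = error;
      return p.kp * error + p.ki * p.integral + p.kd * derivative;
    }

In a robot like this, such a step would typically be run at a fixed rate from an xCORE timer, with the result scaled into a PWM duty cycle for the two motors.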

Another intriguing project, posted by XCore Community user TSC, is to create a tank robot, which forms the top tier of a three-tier hierarchical system of urban search and rescue (USAR) robots. The three tiers are referred to as the Grandmother, Mother and Daughter robots.

The idea is that the Grandmother robot contains the powerful computer hardware necessary to coordinate the autonomous actions of the Mother and numerous Daughter robots, which can traverse hazardous terrain to detect signs of human life. In the Grandmother robot, navigation is carried out by actuators and a diverse range of on-board sensors, which connect to the computer through a high-concurrency software-defined interface bridge built using XMOS multicore technology. You can download the latest build of this project here.

We expect to see more developments in this space through the XCore Community and look forward to seeing even more progress made with autonomous robotics projects.

To summarize, perhaps we can put a slightly different perspective on Kurzweil’s prediction. We may still be 10-15 years away from true embedded intelligence, but in terms of the advances required, we are actually only a few steps away.

As XMOS CTO Simon Knowles discussed at this year’s Future World Symposium, true embedded intelligence requires a number of fundamental advances. These include new algorithms, probabilistic decision-making and a move from programming to training; all of which can be delivered only by future generations of parallel processors.


The XCore Exchange Community at Wuthering Bytes

I had the chance to speak at Wuthering Bytes / OSHCAMP this year, along with the pleasure of running workshops on the Sunday. I chose to talk about concurrency in the embedded world and focused on what I termed "the concurrency grey scale". This grey scale has simple, low-cost microcontrollers at one end and highly parallel FPGAs at the other.
My talk centered on the concept of adding concurrency beyond the common sequential tick-and-interrupt-driven design. The XMOS XS1 microcontroller architecture allows a more granular approach to concurrency, based around tasks and logical cores with event primitives at its heart.
The XS1's event-driven hardware, combined with the new 13.x interfaces, tasks and pointers, provides super-fast, optimized granular concurrency that outperforms generic microcontroller-based RTOS designs. This ushers in a new era of granular, concurrent microcontroller design cycles that are far more agile than making the leap to FPGA when low latency and high performance are required. The design cycle is based purely on algorithms and make, and does not require the mixed disciplines of HDL and a protocol engine. It is relatively trivial for microcontroller and embedded design engineers to step into the new xTIMEcomposer tools from XMOS and hit the ground running.
Overall the talk was well received and slotted in nicely with some of the other weekend talk themes like 'The future of microprocessor' and 'learning to program Parallella'.
For me the most interesting part was the workshops on Sunday, where I had the chance to bring two sessions of new engineers and developers up to speed with xTIMEcomposer 13. We gave them each a startKIT (and a USB cable) and got them to register and install the tools. We had a mixed group of Linux, Mac and Windows users across both sessions.
The easiest install seemed to be for the Mac folks, followed closely by the Windows guys. After some initial issues with user registration, everyone had xTIMEcomposer installed. On the Linux side we had to manually update the udev rules (there is a script to help with this) and soon enough we had the Linux guys up and running too. Next stop was a run-through of xTIMEcomposer using the startKIT "Spinning Bar" tutorial, which everyone completed successfully!
I then took everyone through the new XC 13.x, explaining the port architecture and port definition/declaration, along with timers and clock blocks, before moving on to the select statement, the center of the event-based language. I next introduced interfaces and took everyone through their definition and use. Following the basic introductions, I put all of this together to create client/server-based tasks. Finally, I explained the subtleties of combining and distributing tasks in order to make efficient use of logical cores.
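For anyone who missed the workshop, the sketch below shows the shape of the minimal client/server example we built up to: a server task that owns a port and services calls on an interface, and a client task that calls it, composed with a par statement. The interface, port and task names are illustrative rather than the exact workshop code.

    #include <xs1.h>
    #include <timer.h>

    // A tiny transaction interface: the client asks the server to set an LED.
    interface led_if {
      void set_led(int on);
    };

    out port p_led = XS1_PORT_1A;    // illustrative pin choice only

    // Server task: owns the port and services calls made over the interface.
    void led_server(server interface led_if i, out port p) {
      while (1) {
        select {
          case i.set_led(int on):
            p <: on;
            break;
        }
      }
    }

    // Client task: toggles the LED twice a second through the interface.
    void blinker(client interface led_if i) {
      int on = 0;
      while (1) {
        i.set_led(on);
        on = !on;
        delay_milliseconds(500);
      }
    }

    int main(void) {
      interface led_if i;
      par {
        led_server(i, p_led);
        blinker(i);
      }
      return 0;
    }

From here it is a small step to mark the server [[distributable]] or [[combinable]] so that, as discussed in the session, several such tasks can share a logical core when resources are tight.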
We were fairly hard pressed for time in each session, limited to about a couple of hours, but pretty much everyone got up to speed very quickly; this introduction was enough to get them running on startKIT, armed with the basic skills required to start creating their own concurrent code.
 
Overall it was a great weekend of hardware and software shared with many interesting participants, not to mention good food and beer from the Hebden Bridge publicans. I would highly recommend visiting this wonderful Pennine event if you get a chance next year. I would also like to thank the organizers for inviting me along to talk about concurrency, and XMOS for sponsoring the event and providing the tools to enable it to happen.
Al's presentation slides are available for download.

Launch of xCORE-XA development platform at ARM TechCon


This year at ARM TechCon we launched the xCORE-XA development platform. For some time now we have been sampling silicon, but customers wanted a development platform on which to evaluate it, and we have provided this, with built-in hardware debuggers and all the I/O brought out to pins. People liked the integrated ARM and xCORE debuggers, and the tools suite that presents code entry and breakpoints for each processor technology side by side.

Feedback was good on our demo Industrial networking board. Customers could see how the xCORE processors and I/O met the timing requirements of time sensitive networking whilst the ARM Cortex-M3 was well suited to the higher layer fieldbus stacks with code re-use and extra memory.

Several people asked why we decided to partner an M3 with our xCORE multicore processors. When I pointed out that the architecture, with its low-power, low-latency, single-cycle instruction set, has many similarities with the high-performance credentials of the xCORE, they could see that it is a firm pairing. The ARM adds a huge code base and architecture familiarity, allowing engineers to choose XMOS with confidence.

This year the biggest trend at the show was the Internet of Things (IoT); Machine to Machine (M2M) has been re-branded as IoT and just about every vendor had a story to tell in this space. Of course, what started out as connectivity at a local level now includes a network infrastructure and cloud-based apps that connect to the devices. Many of these are targeted at a home setting and it will not be long before my coffee maker is ‘friends’ with my fridge, which in turn will order more milk on my behalf.

XMOS demonstrated an xCORE-XA development platform that was connected to the Internet and serving a web page, with widgets that could control LEDs on our board. We’ve just published a series of app notes that show how easy it is to add Ethernet to an xCORE processor and start sending packets with a few simple API calls to our library.

The mbed area at the show was a hive of activity and interest, with some great demos – I am determined to build the Internet-connected Nespresso machine that I saw. And of course the mbed area had the eternal crowd pleaser – beer!


Touching on the world of haptics technology


Haptics has been defined in academia as “the science of applying touch (tactile) sensation and control to interaction with computer applications”. By using input/output devices users can receive feedback from computer applications in the form of ‘felt’ sensations. The potential uses for haptics technology are almost infinite and blogs such as: Haptic Antics and Haptic Feedback are great places to do some further reading.

In this blog post I’ll be explaining some of the specialist language used in the haptics domain and providing an overview of the technologies involved. I’ll be following this up with a series of posts looking at how haptics technology can be applied to markets such as the automotive, consumer electronics and home appliances industries.

As with all specialist groups, the haptics community has its own lexicon, which can be confusing at first. Firstly there are two types of haptic modalities through which a computer can send information to a human.

  • Tactile modality: This refers to a sense of pressure. From the perspective of a human interacting with haptics technology, it is the sensation of feeling something through your sense of touch, even though nothing has touched you.
  • Proprioception modality: This refers to the perception of body awareness – the sense of knowing something is there without seeing it or using your other senses.

Secondly, here are a couple of key terms you are likely to hear if you’re planning to attend any haptics conferences any time soon:

  • Rendering: Borrowed from the vision-processing industry; in haptics this refers not to the shading and lighting of an object in the computer, but to the process of giving users the 3D sensation of a real object when they are presented with a virtual reality representation of it.
  • Cues: An important factor in giving the brain enough information to work with. For example, a proprioceptive cue is about knowing (without looking or using your other senses) where some part of you is in space.

How does haptic technology work?

There is much innovation in the transducers used to support haptics technology. This includes making tiny actuators, surface coatings, magnetics, needles and high voltage electrodes. Behind that, in most cases you will need one or more microprocessors or FPGAs to render the haptic effect, and finally PCs to configure the haptic or create a virtual world.

So the needs of the haptic processor are:

  • Multiple channels of A/D conversion
  • Extremely low latency in feedback loops
  • Closure of control loops at 1, 4, 8kHz and more
  • DSP to perform the maths
  • Interfacing to secondary micros or a PC

If you are familiar with xCORE multicore microcontrollers, you’ll recognize many of those features, and indeed it’s no surprise that more than one company is already using our devices for haptic applications. Using xCORE you can comfortably close feedback loops at the required rates – in fact in the lab we have achieved closure at up to 18kHz, allowing the production of an even better user experience. Interfacing to a multitude of communications standards, including USB, is also straightforward.
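As a flavour of what closing a loop at those rates looks like on an xCORE device, the fragment below paces a loop at 8kHz from a hardware timer; the ADC read, rendering maths and driver output are placeholders, since they depend entirely on the transducer being driven.

    #include <xs1.h>

    #define TIMER_HZ    100000000              // 100 MHz reference timer
    #define LOOP_HZ     8000                   // 8 kHz loop closure
    #define LOOP_TICKS  (TIMER_HZ / LOOP_HZ)   // one iteration every 125 us

    // Close a feedback loop at a fixed 8 kHz rate, paced by an xCORE timer.
    void haptic_loop(void) {
      timer tmr;
      unsigned next;
      int sample, drive;
      tmr :> next;
      while (1) {
        next += LOOP_TICKS;
        sample = 0;        // sample = read_adc();            (placeholder)
        drive  = sample;   // drive  = render_effect(sample); (placeholder DSP)
                           // write_driver(drive);            (placeholder)
        tmr when timerafter(next) :> void;     // wait for the next 125 us slot
      }
    }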

I find haptics technology fascinating: I’m sure in the coming years we will (literally) find it ‘touching’ our everyday lives more and more often.


How to optimize start-up time for automotive AVB


XMOS recently attended the 2014 IEEE-SA Ethernet & IP @ Automotive Technology Day in Detroit. The event, now in its fourth year, was held in the US for the first time, after previous editions in Germany. It was also the first time the event was organized through the IEEE Standards Association, a sign of the growing applicability and maturity of Ethernet as a technology in the automotive industry.

The theme of the event was “Moving towards a mature and pervasive automotive network: from infotainment to autonomous driving, how Ethernet is uniquely qualified to transform the vehicle”.  The two-day conference program showcased a number of themes, including the status of IEEE standardization efforts and physical layer developments that will enable an automotive Gigabit Ethernet network. New applications and use cases of the technology, security aspects, and testing methods and tools were also presented.

I co-presented on "AVB in Automotive Infotainment Networks” alongside Günter Dannhäuser from Daimler. The presentation focused on work that we have been doing with Daimler to reduce the startup time of XMOS AVB endpoints to meet a number of use-cases in automotive. Through a combination of hardware and firmware optimization, we presented a reduction from 7 seconds to just over 500ms from cold boot to first audio output.

You can see our presentation slides here.

A new aspect of the 2014 event was a joint session with the 7th AUTOSAR Open Conference where the software aspects and the Ethernet integration into the AUTOSAR software platform were presented and discussed. Marc Weber from Vector Informatik presented a roadmap to integrate AVB driver capability in a future version of AUTOSAR.

The event was a unique opportunity for OEMs, suppliers, semiconductor vendors and tool providers to come together. In parallel with the conference, XMOS exhibited alongside forty other companies, which allowed participants to experience the technology and directly interact with vendors. We showed XMOS AVB Daisy Chain endpoints, streaming audio to and from an Apple Mac, via a single twisted pair using Broadcom BroadR-Reach PHYs. A number of other vendors were showing XMOS endpoints interoperating with their equipment.

With the recent innovations in vehicle electronics in the Infotainment, Active Safety, Powertrain and Body domains, there is greater need than ever for a new generation of network that can scale with the bandwidth, time synchronization and quality-of-service needs of endpoints. There was consensus at the event that Ethernet and AVB/TSN meets these needs and will be prevalent in cars in the coming years.


Multicore programming: XMOS shows how at electronica '14


This was my first time at electronica... it is a large show, to say the least! That said, once you’re in the right area, there are many relevant stands conveniently placed close by. That area, for me and the rest of the XMOS team, was our distributors’ stands, Macnica and Topas. Some of the notable discussions we had related to AVB and its adoption in automotive networks; this was a common theme, with many other stands presenting information on automotive networking. In addition to the automotive applications, I also found out about a really interesting AVB application, implemented using XMOS, involving a network with hundreds of endpoints. Watch this space for more details!

On the Thursday of the show, Elektor Magazine had invited us to present a couple of workshops on multicore theory, applications and programming. This was a great opportunity to get people involved with XMOS hardware and show how to solve real-time problems.

 

The aim of the workshop, after the initial introduction to the architecture, was to create a servo motor server with the angular position of the motor controlled by a capsense slider on an XMOS startKIT. For those eager for the challenge, there was also an extended task (for homework!), to connect two startKITs/servo motor servers using CAN bus and synchronize the positions.

This was actually quite easy, and within an hour there were motors zipping backwards and forwards as the capsense slider was swiped. The startKIT is a great platform for this type of development. The peripheral hardware support is supplied as an easy-to-use source code library allowing fast implementation, and there is plenty of GPIO accessible on the header to connect additional peripherals.
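For anyone who wants to recreate the exercise, the fragment below gives a rough idea of the servo side of the solution: a 50Hz frame whose 1–2ms pulse width tracks a 0–255 position value received from the slider task. The port choice, scaling and channel protocol are illustrative assumptions, not the exact workshop code.

    #include <xs1.h>

    #define TICKS_PER_US  100                       // 100 MHz reference timer
    #define FRAME_TICKS   (20000 * TICKS_PER_US)    // 20 ms servo frame (50 Hz)
    #define MIN_PULSE     (1000 * TICKS_PER_US)     // 1 ms pulse = one end of travel
    #define MAX_PULSE     (2000 * TICKS_PER_US)     // 2 ms pulse = other end of travel

    out port p_servo = XS1_PORT_1F;                 // illustrative startKIT pin

    // Servo server: generates the PWM frame, updating the pulse width whenever
    // a new 0-255 position arrives from the slider task.
    void servo_server(chanend c_slider) {
      timer tmr;
      unsigned t, pulse;
      int position = 128;                           // start at mid travel
      tmr :> t;
      while (1) {
        select {                                    // poll for a new slider position
          case c_slider :> position:
            break;
          default:
            break;
        }
        pulse = MIN_PULSE + ((MAX_PULSE - MIN_PULSE) * position) / 255;
        p_servo <: 1;                               // start of the high pulse
        tmr when timerafter(t + pulse) :> void;
        p_servo <: 0;                               // rest of the 20 ms frame is low
        tmr when timerafter(t + FRAME_TICKS) :> void;
        t += FRAME_TICKS;
      }
    }

A companion task would push slider readings into the other end of c_slider; that side, and the CAN-bus extension, are left as the homework they were.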

Hopefully the attendees learnt some of the unique benefits of multicore architectures in solving real-time embedded problems and enjoyed the hands-on exercises.

Many thanks to the Elektor team for the invitation and hospitality during the workshops.

The presentation slides from our session are available to download. If you haven't done so already, you can also download xTIMEcomposer Studio free-of-charge by visiting the support section of the website.

Eurohaptics shows the advantages of touch


The annual Eurohaptics conference earlier this year in Versailles saw experts from around the world gather to discuss their research in this field; 2014 marked the ninth occasion this conference has run.

There was a wide spectrum of interest ranging from academic research into human perception, to industrial applications of haptic devices that will ultimately enhance our world.

Daimler Benz, in association with Continental, was showing the haptic interface that is central to the new Mercedes C class.

The armrest doubles as a haptic input device. It knows if you are leaning your arm on it and ignores that; but point a finger at it and it becomes a control surface like your laptop trackpad. When pressed, this area ‘feels’ like a button, even though there is no real button there. It is a great example of haptics in action. A transducer moves the plate by a tiny amount, with a special excitation profile that creates the sensation of a much greater movement than actually occurs.

It turns out this button press application is becoming a distinct class of haptic device because at least two other organizations were showing similar effects. Aito demonstrated the use of miniature piezo transducers to create the button press ‘feel’ in panels of solid aluminium. This has great application for controlling white goods and medical instruments where the surface can be hygienically smooth and clean. An interesting extension to the actual sensation the system produces is the role of audio cues in fooling the brain into the perception of movement. For me this was most pronounced in the demo provided by Redux, where the same physical haptic was accompanied by an audio ‘camera flash’ noise. With the volume turned up, the button actually felt different, and to me it seemed that my finger had travelled further – even though I was still pressing what I knew to be a solid aluminium plate!

As with all specialist subjects the haptics community has its own lexicon, which can be confusing at first. When describing subjective things like feelings, some much more accessible words are in play (fluffy, rough, slippery, etc). A paper jointly presented by NTT and the University of Electro-Communications in Tokyo showed how the onomatopoeia of repeated words in the Japanese language is so strong that one can build a computer to quantitatively analyze the feeling the words conjure. Maybe this will be used in the future to accompany other haptic effects with an audio phoneme that enhances the sensation.

When it comes to ‘slippery’, there is a lot of research going into recreating an artificial haptic of friction; the Holy Grail is being able to render different surface textures. Of course many people see the benefit of taking existing technology like tablets and adding sensation to their surfaces. This is a tricky one. Can you really make a piece of glass feel like wood? Many are trying; two researchers have been experimenting with ultrasonically moving the surface of a piece of glass in an attempt to change its friction profile. Subjectively I would say that so far they have managed to make glass feel like “regular glass”, and more like slippery ‘wet’ glass when the ultrasonics is applied.

Rendering of texture takes many other angles in its pursuit. One demonstration showed how the edge enhancement of a video image could be converted into electrical stimulation, to emulate the sensation of feeling those edges. This relies on an effect called electrovibration. “Are you electrocuting me?” Well, yes, sort of.

Phantoms are another commonplace item in the haptic world. These are force feedback devices that create the feel of an object in 3D. The phantom is essentially a robot arm; when the user tries to push against the arm, its motors are used to prevent movement around a virtual object or space. By varying the nature of the force feedback, different sensations can be created.  One can pick up a virtual object and feel the weight of it, use a virtual tool on a virtual object and feel how elastic it is, or even push right into it and feel how viscous it is.

Haptics technology has the potential to have a major impact on numerous industries: gaming, medical technology, automotive and consumer technology just to name a few. But as the real world uses become more apparent, haptics technology will need to increase in sophistication, and the companies providing the supporting technologies will need to work hard to remain ahead of the game.


Next Generation xCORE


xCORE-200

As you may have read, after several months of hard work we are pleased to announce the arrival of a new member of the XMOS xCORE family: xCORE-200.

Delivering up to twice the performance and four times the on-chip SRAM of our first-generation xCORE multicore microcontrollers, xCORE-200 also adds a Gigabit Ethernet port, a high-performance programmable XMOS USB 2.0 interface, and up to 2Mb of on-chip Flash memory.

Some customers have already received early samples of the evaluation board, the eXplorerKIT.

Our engineers have also been putting the eXplorerKIT through its paces, and the feedback from both external and internal groups has been overwhelmingly positive.

You can read our full press release here: www.xmos.com/news/press/18349

Or check out the xCORE-200 product pages here: www.xmos.com/xCORE-200

xCORE-AUDIO

We are also introducing xCORE-AUDIO, our first range of application-specific products.

Built specifically to meet the demands of both high-resolution consumer audio and complex multichannel professional audio applications, xCORE-AUDIO comes in two flavors:

The first of these to go live is the stereo Hi-Res-2, which will be available for as little as $2.00 in high volume.

You can read our full press release here: www.xmos.com/news/press/18350

Or check out the xCORE-AUDIO product pages here: www.xmos.com/xCORE-AUDIO

NEW WEBSITE

Last but not least, you may have noticed that the website has been completely redesigned!

Following extended feedback regarding our previous website design, we have re-launched it with an entirely new menu system and page layout. Most importantly, menus on each page are now persistent, giving you a much better indication of where you are and what is available to you.

We’ll post up again soon with details of all the website changes and where you’ll find things in the new site, but hopefully the new design should be a lot more intuitive.

We welcome all feedback on the new website design. Just leave a comment below.
