Posted by - filed under iot, physical.

As you may know, we’re actively contributing to the Oktoberfest of Things. Our smart beer table is now pretty much ready and we are just waiting for some revised beer coasters to arrive. It’s a crazy project, but it still has some commerce background (automatic reordering, process improvements). Still, the focus here is of course on engaging with the community. We’ve shared all our source code and design files via GitHub, check it out here. The code is under the GPL and the images are Creative Commons BY-SA. The current features include:

  • Double-tap to send a “tap” event. Currently this highlights the beer mug in the UI, e.g. to call the waiter.
  • Lift up the mug to send an “up” event. Whenever this happens, the liftup count for that mug is incremented.
  • Put the mug back down, and the new fill level is calculated and animated based on an average of 30 ml per liftup/sip (see the sketch below).
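
To illustrate, the fill-level math boils down to a few lines of JavaScript. A minimal sketch (the 30 ml per sip is from above; the one-liter mug size and function name are my assumptions, not the actual UI code):

var MUG_ML = 1000; // assumed mug size (a one-liter Maß), not from the real UI
var SIP_ML = 30;   // average amount per liftup/sip, as described above

// remaining fill level as a fraction between 0 and 1
function fillLevel(liftups) {
    return Math.max(0, MUG_ML - liftups * SIP_ML) / MUG_ML;
}

fillLevel(5); // 0.85 -> animate the mug to 85% full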
[Photo: the smart beer table]

In this post, I’d like to outline the technical architecture for this project. For the first time, we’ve used a Spark Core for one of our projects. The Core’s WiFi reliability has been greatly improved over the last few weeks and we’re now able to work with it in our enterprise environment.

Take a look at the architecture below, which is pretty simple:

[Diagram: the technical architecture]

If we start on the left, you can see the beer coasters with integrated pressure sensors. In reality we have eight, which is a good number of people to have around a beer table. The pressure sensors are hooked up to the 8 analog inputs of a Spark Core to figure out whether the mug is on the coaster or not. We also use the pressure sensors to listen for the taps. The Core then communicates (transparently for us) with the Spark Cloud, and our web UI opens an SSE (Server-Sent Events, like a one-way WebSocket) connection via JavaScript. All logic to update the UI currently lives in JavaScript in the browser; the HTML page that hosts the JavaScript could be a static one.

Let’s take a look at a snippet of the firmware that runs on the Spark Core:

void loop() {
    if (current < CVALUES)
        current++;

    for (int mug = 0; mug < 8; mug++)
    {
        // sample the pressure sensor and push the reading into this mug's history
        bool newState = readBool(A0 + mug, 2);
        pushValue(mugVals[mug], newState);

        if (checkDoubleTap(mugVals[mug])) {
            eventDoubleTap(mug);
        } else {
            // check whether all stored samples agree on the new value
            bool isEqual = false;
            int i = 1;
            while (i < CVALUES) {
                if (newState == mugVals[mug][i]) {
                    isEqual = true;
                } else {
                    isEqual = false;
                    break;
                }
                i++;
            }

            // only fire an event once the state has settled on a new value
            if (isEqual && newState != states[mug])
            {
                states[mug] = newState;
                event(mug);
            }
        }
    }

    delay(DELAY);
}

In our main loop, which is executed over and over again just like on an Arduino, we iterate over the 8 mugs. Max Schrupp, on the labs team, greatly improved the code just a few days back and we now detect the taps pretty reliably. For each mug, we determine the current state (up/down) and push that value into an array of past states for that mug. We store up to 8 past values for each mug this way. We then check for two things: is it a double tap? And if not, has the state changed? (We compare against all stored values, so all 8 values per mug need to have switched.)

Sending the events will probably be the most interesting part for you. Luckily this part is easy, thanks to Spark:

        char msg[63];
        if (states[mug]) //mug down
        {
            sprintf(msg, "{\"id\":%d,\"d\":\"down\"}", mug);
            Spark.publish("mug", msg, 60, PRIVATE);            
        }
        else //mug up
        {
            sprintf(msg, "{\"id\":%d,\"d\":\"up\"}", mug);
            Spark.publish("mug", msg, 60, PRIVATE);            
        }

Here, we check the state of the mug in the states array and then use Spark.publish to send the up or down event. The data we send is a snippet of JSON that fits nicely into the 84-byte maximum of the Spark publish function.

Within the website that represents the UI, a small piece of JavaScript receives these events:

var url = 'https://api.spark.io/v1/events/mug?access_token=' + localStorage.spark_token;

var source = new EventSource(url);

source.addEventListener('mug', function(e) {
  var msg = JSON.parse(e.data);
  console.log(msg);
  var data = JSON.parse(msg.data); // the published payload is itself JSON

  if (data.d == 'up')
  {
    if (!liftups[data.id])
      liftups[data.id] = 0;

    liftups[data.id]++;

    updateLiftups(data.id, liftups[data.id]);
  }
...

All the magic, like sending the events via a WiFi connection up to the Spark Cloud and down to the web UI via SSE, is handled by Spark.

[Screenshot: the web UI]

Please do check out the GitHub project for details and the full source code – which includes the web code (HTML, CSS, JS, plus images) and the Spark Core firmware (see the sparkcore directory). We will most likely improve the code over time, but our next stop is the #iotcon in Berlin, where we will demonstrate the smart wine shelf and our brand new #oktoberfestofthings beer table! I also have two sessions, one about our IoT prototypes and the other about Google Glass. In addition, I am co-moderating the hardware night on September 2, where we’ll have a look at many IoT devices.


Posted by - filed under 3D, arduino, hardware, hybris, physical.

As you might have heard, we’ve opened the hybris labs space in Munich. This is a showroom where the labs team can showcase the prototypes we’ve built over the years, and where potential new customers as well as our partners and existing customers can have a look, get inspired and discuss new technology and concepts. We’re still working out the details of how to maintain that showroom, but we’re already collecting the gadgets and prototypes that we’ve built over the years. It seems to pay off that we “became physical” very, very early: it simply means we have stuff to put into those rooms. From wine shelves that show the latest around IoT to in-store displays that use gestures to navigate or iBeacons to identify customers.

[Video: The making of hybris labs’ logo]

 

But this post is not about these prototypes; it’s more that I realized that pretty much everything in the hybris labs space has a story. A real story, one that you do not get when you “just buy it” – via external services or off the shelf. Sometimes a painful one, a story of hard work and failure when things did not turn out the way we wanted. But very often also stories of great exploration and experimentation combined with good returns – it very often really paid off when it comes to the reach we’ve achieved with our prototypes.

So everything has a story. And I could not resist giving our #hybrislabs logo, the logo that visitors will first see when they enter the space, a special story. Just ordering it would have been too easy and not really hybris labs. So… as my family and I are new members of FabLab Munich, I decided to 3D print, laser-cut and then electronically coolify the logo.

[Photo: the finished logo]

Some people have asked me for a step-by-step guide, so I hope to at least shed a bit of light on the process of making such a logo. I know, for example, that Andreas Kopp of FabLab Munich fame and Pixomat wants to recreate such a logo. It turns out that such a project is a great introduction to 3D printing, laser cutting and also electronics.

The logo was created in 3 steps:

  • 3D-printing the housing – each letter forms a little box with around 1.5mm wall thickness. I used a 2D SVG file from Inkscape, imported it into AutoDesk 123D and later exported it to CURA for printing on an Ultimaker 3D printer. This process takes incredibly long. Each letter needs to be printed, and even though I combined a few per print job, I estimate around 20 hours in total. Most of that time you just have to supervise the print job, but it’s still time you have to spend.
  • Once we have these “containers”, I added WS2812B RGB LED strips to the letters. This means a lot of cutting, drilling holes for the in/output lines and especially soldering. This process takes around 5-6 hours depending on your soldering skills.
  • Finally, after everything is electronically connected, tested and glued to the bottom of the 3D-printed letters, it’s time to laser-cut the top elements out of milky acrylic glass. As the laser cutter is a very exact machine, you can simply measure the length of a letter that was 3D printed and scale the laser-cut design accordingly. I first created an MDF (wood) prototype of these top elements before I laser-cut the more expensive acrylic glass.

 

Let’s take a more detailed look.

 

3D-printing the housing

For this, I started with a 2D SVG file (vectors) for the hybris labs logo. I got this a while back from our designers, SNK in Munich. I imported it “as a sketch”, which means you get a flat surface in 123D.

[Screenshot: the logo imported as a sketch in 123D]

Next, I extruded the logo to 3 cm high. Now I had little solid elements for each letter.

[Screenshot: the extruded letters in 123D]

To create containers, I selected the top surface element and chose “shell”. This lets you specify the wall thickness, which is very important for 3D printing. Choose a sensibly thick shell: I ended up scaling the model down quite a bit later on, which also reduces the shell thickness. I chose 3mm, which worked well even after scaling down by 0.5 in 123D and again by 0.75 in CURA (just my numbers).

[Screenshot: the shell operation in 123D]

You will now likely have to create a few .STL exports of that file, because at least the 3D printers I had access to cannot print a meter-long logo. So I ended up with around 5 print jobs. The actual printing, i.e. the creation of the 3D printer’s GCODE, was done with CURA. That tool imports an STL file and exports the GCODE onto an SD card which goes into the 3D printer (like the Ultimaker).

I will not cover all the settings in CURA, as this might also be a bit specific to the 3D printer you have.

[Screenshot: slicing the letters in CURA]

[Photo: printing a letter on the Ultimaker]

Making it Blink

At this point, we have our 3D-printed containers and we have to add RGB pixels to them so we can later control them with an Arduino. The ingredients are:

  • Access to a drill so you can drill holes into the containers.
  • WS2812B strips, like these from eBay. These can be controlled with the Adafruit NeoPixel library.
  • 3-wire cable for connecting all the pieces.
  • Soldering iron and solder.
  • An Arduino or compatible microcontroller and a 5V power supply; the power needed depends on how many pixels you drive. Each pixel draws about 16 mA, I believe – in our logo we’ve got around 100.
  • Some wine to calm your nerves. This step takes a lot of accuracy, time and patience.

Once I had figured out the best way to connect each letter, I cut the WS2812 RGB strip into pieces and put the pieces into the containers.

[Photo: the cut LED strip pieces inside the letters]

Next up is the soldering. Each pixel strip has VCC, GND and data in/out pins. Essentially, the three wires are connected to the beginning of each strip part, then connected between the parts. It is very, very important that you test the pixels after each soldering step. The problem is that some pixels might be bad, and if you have such a pixel in the middle of your project, you will search for the fault endlessly later on. So solder one piece, then try it out immediately. For testing, simply import the Adafruit NeoPixel library into the Arduino IDE, load the strandtest example and set the right values for pixel count and data pin. Do not forget to connect the GND of your power supply to the Arduino’s GND, and place a 330 ohm resistor before the first pixel’s data input. It is also recommended to put a capacitor across the power supply to smooth out spikes when the power supply is turned on.

[Photo: soldering the strip segments]

Laser-Cutting the top elements

The final step is a rather quick one in the overall process. Using a laser cutter, I cut out the top elements that go on top of the housing. The good thing about a laser cutter (compared to a 3D printer, I feel) is that it is very exact. So it makes sense to 3D print some parts first, measure them, then adapt the laser-cut design. Still, I first used cheap MDF/wood to check whether the measurements were really correct, then I used acrylic glass for the second run.

[Photo: the MDF test cut]

I simply measured the length of the “l” in “labs” and adjusted the size of the labs logo based on that length. I used two supporting lines/rules in Inkscape for that (sorry, no screenshot available). With the cutter we have at FabLab Munich, I then had to save the file as a PDF and import it into Corel Draw, which is the vector program set up on the laser cutter’s PC. Laser cutting itself is a pretty easy and quick process – once the speed and intensity of the laser are set, the actual cutting just takes a few minutes.

Finally, the elements – the 3D-printed bottoms full of electronics and the laser-cut top parts – need to be glued together. To hold everything in place I used the clips that you normally use to hang up your wet clothes…

[Photo: clothespins holding the glued parts in place]

Et voilà

I realize not every step might be 100% clear at this point, but I hope it gives a good overview. It’s the best I could come up with over the last 45 minutes, and I need to move on to a few other things. Please comment or tweet me with questions, share it, thx a lot for reading!


Posted by - filed under iot.

We recently finished our first IoT prototype, the smart wine shelf. It was a big success and it enabled us to have many, many insightful discussions with people in the IoT industry and with our customers. After a bit of thinking, we realized that it makes a lot of sense to add a few more prototypes in this area. For the next one, we’ll be focusing on the sensing part, e.g. how we can detect customers’ interest in products. In the end, the idea is to research how we can gather analytics events in a similar fashion as on a website. On a website, using tools like Google Analytics and others, we know exactly when our customers visit us, which products they look at, which products they take a closer look at – and finally when they leave the store again. In between they might have bought a product or not – and if they are about to leave, we can think about providing incentives to buy: promotions, for example.

On a website, the profile you build up yields real-time changes to the pages you visit. Why is this not happening in the physical world? For example, why does the music in a store not change based on the demographic profile of the people visiting? Why do video ad solutions not adapt to the people in the store? You can add other senses like smell or lighting to this.

Why do we not have the same for the physical retail space? Well, we believe that with the help of sensors and iBeacons for fine-grained location, we can do exactly that. We’ll take a look at this in our next prototype, called Funky Retail. More details soon, but let’s start discussing this now – as always, on Google+.

[Image: Funky Retail teaser]

Posted by - filed under APIs, hardware, iot.

I am attending ThingsCon right now and was in a workshop with Zach Supalla, the CEO of Spark. Besides hooking up a brand new Spark Core to the cloud and sending Server-Sent Events down to a web page, I also finally figured out how to consume these events from a regular node.js program (and not *just* a web browser…). Here’s a quick tutorial to show you how easy it is. I am assuming you have a Spark Core ready and connected via WiFi.

[Photo: the Spark Core with a PIR motion sensor]

First, the Spark firmware. In this case I connected a very simple PIR motion sensor – you could change that and just send a publish event every few seconds.

unsigned long lastTime = 0UL;

void setup() {
    pinMode(D0, INPUT_PULLDOWN);
}

void loop() {
    unsigned long now = millis();

    if (now - lastTime > 30000)
    {
        lastTime = now;
        Spark.publish("ping", NULL, 60, PRIVATE);
    }

    int val = digitalRead(D0);

    if (val == HIGH)
    {
        Spark.publish("motion", "some data here", 60, PRIVATE);
        delay(5000);
    }

    delay(50);
}

You can see that a ping event is sent every 30 seconds, and a motion event only when motion is detected. Very simple, right?

Let’s move on to the node.js code. I initially used the node.js EventSource library, but this did not work: while I got event notifications from the library, the events themselves were empty objects. At #thingscon today I looked closer at the Spark command line library and finally ripped out the essential parts for subscribing to the SSE events from that lib. The result is this small script. It mainly connects to the Spark SSE source via the request library; then, whenever some data is available, it parses the data according to the SSE spec (which is pretty much line by line, plus stripping off the “event:” prefixes).

var request = require('request');
var extend = require('xtend');

var requestObj = request({
    uri: 'https://api.spark.io/v1/devices/xxx/events?access_token=xxx',
    method: "GET"
});

var chunks = [];
var appendToQueue = function(arr) {
    for(var i=0;i<arr.length;i++) {
        var line = (arr[i] || "").trim();
        if (line == "") {
            continue;
        }
        chunks.push(line);
        if (line.indexOf("data:") == 0) {
            processItem(chunks);
            chunks = [];
        }
    }
};

var processItem = function(arr) {
    var obj = {};
    for(var i=0;i<arr.length;i++) {
        var line = arr[i];

        if (line.indexOf("event:") == 0) {
            obj.name = line.replace("event:", "").trim();
        }
        else if (line.indexOf("data:") == 0) {
            line = line.replace("data:", "");
            obj = extend(obj, JSON.parse(line));
        }
    }

    console.log(JSON.stringify(obj));
};

var onData = function(event) {
    var chunk = event.toString();
    appendToQueue(chunk.split("\n"));
};

requestObj.on('data', onData);

If we assume the node.js program is saved to sse_request.js, you should be able to run it on the command line via

node sse_request.js

I totally want to note that this code is really ripped out of the excellent Spark CLI – which I recommend installing for anyone doing stuff with the Spark Core. It is a great little helper.


Posted by - filed under hardware, hybris, iot, mobile.

We’re close to presenting a new #hybrislabs prototype to the world, this time around IoT and retail. Now that the technical design is pretty much nailed down, implemented and to a large degree well tested, I figured it’s the right time to write a blog post about it. We will first show this new prototype at the hybris North America Customer Days 2014 in Chicago in early May.

You’ve seen me writing and presenting around IoT before, and the smart wine shelf is really the project I used to explore and learn the most around IoT. So what is the smart wine shelf? Well, a wine shelf (to some degree), connected to the internet. Here’s a picture:

[Photo: the smart wine shelf]

Why would you connect a wine shelf (aka an “element in a retail space”, hey commerce!) to the internet? Two reasons:

  • customer experience and
  • analytics

Our wine shelf improves the customer experience, as it will show you which wine fits your taste. We’ve created a mobile app which allows customers to walk through a quick wine test, then send their wine profile down to the shelf. The right selection of wine, according to your profile, lights up. Whenever you pick up a bottle – with or without a previously sent profile – you see details about the wine, its winery and which food it goes with. If the shelf has a profile, it will also show you why this wine fits: because it matches your overall style, acidity profile, etc.

[Photo: IMG_0002]

And as for analytics: whenever a customer lifts a bottle, either with a previously performed wine test or without, we send that data to the cloud where we analyze it. This analysis is used for two displays, as you can see in the picture. The first display presents the analytics dashboard of the wine shelf. It would normally not be close to the shelf itself; we just put it there for demonstration purposes. The second display provides details about the wine that is taken out of the shelf.

[Photo: the two displays next to the shelf]

Making our life easier

While this is a great prototype, we needed to make our lives a bit easier here and there to actually be able to create it. This is one general thing I’ve learned over the last few years as we’ve been creating more and more prototypes: you concentrate on the main task and simplify the rest. The main task here is figuring out IoT – how hardware is connected in the best possible way: stable, fast and secure. We neither wanted to create the perfect shelf itself (e.g. space for multiple bottles each) nor the perfect “physical selection mechanism” (we might have better placed the light rings above the wine bottles, but it needs to be portable and all components go into one platform per bottle). Also, using switches for input was mainly chosen because it provides a stable and easy input mechanism. There are other ways: ultrasonic distance sensors, pressure sensors, you name it. That’s not the point here.

Be your own customer

Being an absolute idiot myself when it comes to wine, I honestly believe this shelf would provide me a lot of value (what is your experience with wine?). The real power of this can be seen when you stop thinking about the specific implementation of this prototype – the wine shelf. What if every retail space could become smart? What if you could use this to see which foods you can eat based on the allergies that you have? What if the data gathered from these retail elements could be used to provide real-time information for in-store advertising?

The technical stuff

Let’s investigate a bit how the shelf was actually put together and which components have to work together. The following diagram is meant to help guide us:

[Diagram: Technical Wine Shelf - Firmata]
  • iBeacon & smartphone app: A smartphone app is used for the wine test. The app then allows the customer to send her wine profile to a cloud API for all wine shelves. Part of that HTTP call is the id of the wine shelf, which we resolve via iBeacon proximity sensing.
  • All logic resides in the cloud and is accessed via a RESTful API: Once a profile is received and no one else is currently using the shelf to explore their wine taste, the logic (written in node.js) will match the customer profile against the wines in the database. If there are matches, it will look up the wine shelf and determine which positions in the shelf need to be highlighted. It then uses a ZMQ publishing queue, under the topic name of the wine shelf, to send the highlighting data to the shelf.
  • A Raspberry PI and an Arduino are the physical, non-cloud elements that make the shelf smart: The Raspberry PI acts as a subscriber to the cloud, which publishes the selections (see the sketch after this list). It was chosen mainly for (network/internet) stability reasons. We explored the Spark Core, Electric Imp and a few other options (like an Espruino with a CC3000 module, or an Arduino with an Ethernet Shield), but none was stable or fast enough to get the job done (this is changing quickly, but at the time we made the decision this was the case). The Raspberry PI’s main job is to act as a broker of data between the hardware and software worlds. It interfaces with the Arduino, which is a well-known prototyping board. The Arduino handles the “last mile”, or rather “last centimeters”: the actual control of the lights (output for selection) and switches (input for analytics). The Raspberry PI and Arduino also have to understand each other – we’ve used the so-called Firmata protocol to let them talk to each other. It’s a blazingly fast serial protocol and allows us to connect the Arduino to the Raspberry PI via a single serial USB connection.
  • Switches and NeoPixel rings: These are very basic electronic components for the input and output of the shelf. There is almost no drama here, but I can tell you that soldering 16 NeoPixel rings, each with 4 pins to solder, plus the switches, gave me a mild headache for a few days. The pic below also shows these 16 platforms, each housing a NeoPixel ring and a switch for one bottle in the shelf.
[Photo: the 16 platforms with NeoPixel rings and switches]
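
To make the Raspberry PI’s broker role a bit more concrete, here is a minimal node.js sketch of the subscribing side. This is not our production code: the endpoint, topic name, payload and pin handling are made up for illustration, and it assumes the zmq and firmata npm packages (the real shelf drives NeoPixel rings through the Arduino, so the actual output logic is more involved):

var zmq = require('zmq');
var Board = require('firmata');

// talk to the Arduino over the single serial USB connection (port is an example)
var board = new Board('/dev/ttyACM0', function (err) {
    if (err) throw err;

    var sub = zmq.socket('sub');
    sub.connect('tcp://cloud.example.com:5556'); // hypothetical cloud publisher
    sub.subscribe('shelf-01');                   // topic = this shelf's name

    sub.on('message', function (topic, data) {
        // hypothetical payload, e.g. {"positions":[2,5,9]} = bottles to highlight
        var selection = JSON.parse(data.toString());
        selection.positions.forEach(function (pos) {
            board.pinMode(pos, board.MODES.OUTPUT);
            board.digitalWrite(pos, 1); // stand-in for lighting that bottle's ring
        });
    });
});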

Because we both send data down from the cloud to the hardware and collect data from the hardware and send it up to the cloud, our wine shelf is really an excellent example of my currently valid definition of IoT: a thing connected, able to receive and send messages from/to the cloud, able to process data and able to gather input and perform output.

You need a team for this

This is where I am really proud of #hybrislabs. A project like this requires so many diverse skills that no single person can get it done in a reasonable amount of time without going crazy. We’ve truly worked as a multidisciplinary team, and many people with very different skills (mainly from software) have contributed to make it happen. Thx to Bert for the great iOS app, thx to Scott for figuring out the perfect bottles on top of the platforms, thx to Elke for designing the physical shelf, thx to Uwe for iteratively working with me on the platforms that house the switches and NeoPixel light rings, and of course thx to Nick for already creating a script for a soon-to-be-produced labs video, collecting and curating all the data used (wine data, profiles, matching, etc.) and discussing with me at length how to best build the shelf. Of course thx to Paul for many discussions and for letting us do this.

Global Making

I mentioned the hybris NA Customer Days 2014, where this will be shown for the first time. I happen to be in Munich, Germany and the event is in Chicago, USA. This means that on top of all this, we’re also producing the wine shelf in the US, while I will bring the platforms that house the components, cables and computers the day before the event. This is also the reason you see so many connectors between these component platforms: it all needs to be disassembled, put into my travel bag (airport security will love me) and reassembled in Chicago for the event. So a big thx to Tom in the US, too! After the event, we will build the shelf a second time for our permanent lab space in Munich. I expect the second wine shelf to take a fraction of the time needed for the first.

Like always, let’s discuss IoT in the hybris Google+ Community!

Posted by - filed under android, conferences, hybris, mobile, wearables.

My second talk last week at the Mobile Tech Conference in Munich was about Google Glass. At hybris, we’ve explored the two main ways to get content onto Glass – using the Mirror API and native apps using the Glass Development Kit (GDK). Of course we’ve been applying our retail-focused use cases to both of them. Have a look and comment in the G+ hybris Technology group.


Posted by - filed under Uncategorized.

This is a presentation I gave at the Mobile Tech Conference last week. It was very well attended – the topic seems to be getting more and more attention. At labs, we’re working on a few IoT applications in the retail and commerce space, still too early to talk and write about, but the presentation is really an effort to bring together all our findings and learnings so far. Happy to discuss in the hybris G+ group.

Posted by - filed under arduino, BLE, hybris, iot.

It’s been a bit quiet in here for the last few weeks – we had to recover from the awesome hybris Customer & Partner Days 2014 :-) We’ve not been sitting around doing nothing, though: we’ve gotten in touch with a lot of stuff, mainly hardware. We’ve been preparing for 2014, and IoT is #1 on my list of trending topics. In 2013 we already built several connected prototypes – “things” that were able to communicate in some way with the internet. We even added Facebook to the game and posted the winner alongside a coupon on Facebook. All these things had commerce-driven ideas (couponing) at their heart. That was cool, but now we’ve gotten deeper into the microcontroller space and are actively investigating IoT – the Internet of Things.

We’ve just been advising and speaking at the Mobile Tech Conference in Munich, which for the first time dedicated a full day to IoT. And even the evening was filled with IoT, because we had an open house for the IoT Munich meetup right at the conference. I am very much looking forward to the IOTCon in Berlin later in September, where I’ll be helping to organize the tracks and topics as an advisor. And I am pretty sure I’ll have a story or two to tell, too.

So what is the Internet of Things? Definitions have been changing, but this is the one that fits for me right now (also see the illustration below):

A thing on the internet is connected to the cloud/internet, so it is possible to send messages down to the thing and also send data back to the cloud. Also, elements of the internet of things have their own processing power so they can react and work with the data received or create data to send back to the cloud. Things are also connected to the physical world via input (sensors) and output (visual, electrical, mechanical, etc.).

[Illustration: elements of the Internet of Things]

The definitions of IoT have changed a lot, from simple NFC/QR-code-based tagging systems to connected electronics with a steady connection to the internet. But from observing the market and the new products currently coming out, an IoT in which all devices are truly connected to the internet, or at least to some intermediary hub (a smartphone, etc.), seems really feasible.

The question for hybris is: what does it mean for retail and for commerce in general? I believe the impact will be huge, as we’re able to capture a lot of currently untracked data about customer behaviour using sensors. Also, we are able to interact with the physical world and change it (via selection, lighting, sound, motors). Combined with proximity services like iBeacon, or paired with NFC, some really smart customer-focused use cases become possible.

We’re working on stuff, and I’ll share more over time and once we’ve hardened a few ideas and designs.


Posted by - filed under BLE, mobile.

We just had the hybris Customer and Partner Days in Munich, a full week of hybris labs duty, and we ended the week with an open-spaces un-conference with our partners. This is also the reason why it became a bit quieter on this blog over the last few weeks and over Christmas. Nevertheless, we’re back. During the open-spaces conference I had the pleasure of joining an iBeacons/electronics/BLE/all-cool-stuff conversation, and among the guests was Alex Sbardella from Red Ant. He mentioned that one could build an iBeacon using a Raspberry Pi and a BLE dongle, and was kind enough to follow up with the link to the original Adafruit article this was described in. I was able to install this on one of our Pis here, and as I had a few minor issues which I was able to solve in the end, I thought this would make an excellent post for the techblog. Here we go: self-made iBeacons with a Raspberry Pi.

For the following, I assume you have a Raspberry Pi set up with Raspbian, the default OS of the Pi. Also, you will need a BLE dongle, and as Alex noted, there seem to be differences between them. I ordered this BLE dongle here, which works for me. One should maybe note at this point that the total price of all components in this setup is higher than a single Estimote beacon. But this is for development, and the value here is the flexibility to define the iBeacon the way you want it.

So let’s start with installing some libraries that will be required. And just to make sure, you should update the package list and upgrade the apt system, too:

sudo apt-get update
sudo apt-get upgrade
sudo apt-get install libusb-dev libdbus-1-dev libglib2.0-dev libudev-dev libical-dev libreadline-dev

Next, we create a new directory for the bluez bluetooth utility, which will later be our command line tool for the bluetooth dongle. Download the bluez 5.11 build, or feel free to explore a more recent version if you like.

sudo mkdir bluez
cd bluez/

sudo wget https://www.kernel.org/pub/linux/bluetooth/bluez-5.11.tar.gz
sudo tar xvf bluez-5.11.tar.gz
cd bluez-5.11/

Before we can get the bluetooth dongle up and running, we need to build the software from scratch. This will take a while, so grab a cup of coffee in between:

sudo ./configure --disable-systemd
sudo make
sudo make install

At this point the bluez bluetooth software should be compiled and installed on your Pi. Shut down the Pi, add the bluetooth dongle that you bought, then fire it up again.

The hciconfig tool should now be available, as it is part of bluez. If you just type hciconfig, you will see the basic information of the bluetooth dongle that was hopefully detected. We will now perform three steps: enable the dongle, turn on advertisements and turn off scanning for devices (as it can interfere with the advertising).

sudo hciconfig hci0 up
sudo hciconfig hci0 leadv 3
sudo hciconfig hci0 noscan

If this all completed successfully, we are now able to overwrite the Manufacturer Specific Data (MSD) for the advertising packet that is sent out. Adafruit did a great job explaining the parts of the MSD, so I will not repeat that here.

sudo hcitool -i hci0 cmd 0x08 0x0008 1E 02 01 1A 1A FF 4C 00 02 15 E2 0A 39 F4 73 F5 4B C4 A1 2F 17 D1 AD 07 A9 61 00 00 00 00 C8 00

At this point, the BLE dongle should be advertising and you should be able to scan for the iBeacon. There are Apple iOS apps available to scan for beacons, but Android from 4.3 on is also able to run iBeacon scanners. I am using the iBeacon Locate app from Radius Networks – they also have an excellent Android library for scanning iBeacons, by the way.

The fun comes in when you realize that you can change the MSD, meaning you can tweak the data in the advertisement packets. For example, you might want to increase the minor field, which is typically used to differentiate nodes within one location (e.g. the major field is used to differentiate the store location, the minor field the location within a specific store). This can easily be achieved by changing the minor field, for example:

sudo hcitool -i hci0 cmd 0x08 0x0008 1E 02 01 1A 1A FF 4C 00 02 15 E2 0A 39 F4 73 F5 4B C4 A1 2F 17 D1 AD 07 A9 61 00 00 00 01 C8 00
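
If you tweak these fields a lot, hand-editing hex strings gets old quickly. Here is a small node.js helper that assembles the hcitool command from major/minor values – the packet layout is taken from the commands above, everything else (names, the fixed measured-power byte) is just my sketch:

// fixed-width number -> space-separated hex octets, e.g. toHex(1, 2) -> '00 01'
function toHex(n, bytes) {
    var s = n.toString(16).toUpperCase();
    while (s.length < bytes * 2) s = '0' + s;
    return s.match(/../g).join(' ');
}

var UUID = 'E2 0A 39 F4 73 F5 4B C4 A1 2F 17 D1 AD 07 A9 61'; // proximity UUID

function ibeaconCmd(major, minor) {
    return 'sudo hcitool -i hci0 cmd 0x08 0x0008 ' +
        '1E 02 01 1A 1A FF 4C 00 02 15 ' + // advertising header + Apple iBeacon prefix
        UUID + ' ' + toHex(major, 2) + ' ' + toHex(minor, 2) +
        ' C8 00';                          // measured power + padding, as above
}

console.log(ibeaconCmd(0, 1)); // prints the second command from this post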

[Screenshot: iBeacon Locate]

If you have the iBeacon scanner app open and active, you’ll see how first a new beacon joins the list, and then after some seconds the old beacon that is no longer advertised is removed. Pretty cool!

It would be great to hear your stories around iBeacon – let us know what you think in the hybris Technology Google+ group.


Posted by - filed under Uncategorized.

You must have heard of Bitcoin, the digital currency without any central control. Bitcoin was already a big and exciting topic 1-2 years ago in the tech/nerd community and recently went mainstream in the media: from CNN to even the conservative German Handelsblatt. Some see Bitcoin as hype, some purely as an investment option (like gold, silver, etc.) and some have high hopes of it becoming the first global and frictionless payment system. For the sake of global ecommerce, I am hoping that it will not just be a good investment, but also a payment system – one that finally works really well for websites, mobile and at the POS.

I’ll outline a few features of Bitcoin over the next paragraphs, then take a more technical turn and show you how you can run your own (almost) transaction-free payment system using Bitcoin.

First of all, Bitcoin has existed since 2008 and was born out of a paper written by Satoshi Nakamoto. This is not a real person; it might be a single developer or a group of them. Nobody knows who they are, which many consider a good thing because there is nobody to hold accountable. In recent years the Bitcoin Foundation took over the source code of the Bitcoin software (via GitHub) and continuously improves it.

So what are the characteristics of Bitcoin?

  • Instant P2P transactions: There is no middleman. No bank. People can send money from one account to any other Bitcoin address. These addresses are typically presented as a QR code or encoded onto an NFC tag and include a checksum – so it is hard to get them wrong.
  • Low transaction fees: Credit card companies charge 2%; with Bitcoin you pay a fraction of that. The fraction paid is a reward to the P2P nodes confirming the transaction. The more nodes have confirmed the transaction, the less likely it is that the transaction will not go through. Typically 1-2 confirmations are enough for smaller payments. For larger payments you can wait longer or increase the transaction fee to speed up the process.
  • Bitcoin is the first global currency: The same everywhere. Travellers love it.
  • Low risk of inflation: There is a total amount of 21 million bitcoins – ever. This has surely caused speculators to invest and is both a good and a bad thing.
  • Bitcoin can be used anonymously: I think this is by far overrated, as most people today have bought bitcoin through an exchange and have gone through ID processes. There are of course ways to send the money on to anonymous addresses, but it should be pretty tough for most people to figure out a perfect system.
  • It is super easy to get started: It depends on how you want to use Bitcoin, but in the simplest case you open an online account to accept and buy bitcoins, or you download a free app (e.g. on Android) and are instantly ready to receive bitcoin. To get some, you’ll need a marketplace like bitcoin.de or someone who has some.

But speaking of ecommerce, why is this so exciting for shop owners, both online and offline?

  • Low transaction fees: Even if you use a payment provider, e.g. BitPay, the transaction fees are radically lower than the 2% of a credit card company.
  • No reverse transactions: A transaction cannot be reversed. This one is big – retailers can still refund a returned item, but that is again a forward/new transaction. There is no way to cause trouble for retailers by calling the credit card company and disputing a 120-day-old transaction.

Payment providers like BitPay take on the risk of converting Bitcoin into a local currency and depositing the amount into your bank account. You pay a fee for that. If you are more radical, you can explore setting up your own Bitcoin server, using the openly available software.

I recently did just that:

Once you have the software running – use the GitHub link to get and compile it – you start the bitcoind process. It will quietly download the blockchain for a day or so and then you can use the system.

To get your default account address that was generated, type this:

bitcoind getaccountaddress ""
> 1JDa7ht2DgL1ai42hGTWRTkR3iM7XRyBeY

You should set up a new address (you can generate as many as you want) for each order. Very simple:

bitcoind getnewaddress ""
> 1DFCV6SkfehpYUstsjneU8yr41Epe6j9iq

Now transfer some money from another bitcoin wallet. Wait until the blockchain is updated:

bitcoind getreceivedbyaddress 1DFCV6SkfehpYUstsjneU8yr41Epe6j9iq
> 0.00010000

You can also check how many nodes have confirmed the transaction by adding another parameter stating the minimum number of confirmations required (see the example below). In the end, you can very easily transfer the balance to another wallet and from there cash it out or convert it to a local currency.
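
For example, to only count funds that have at least six confirmations:

bitcoind getreceivedbyaddress 1DFCV6SkfehpYUstsjneU8yr41Epe6j9iq 6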

Besides the command-line interface, bitcoind has a JSON-RPC-style interface that is simple to use. I have already explored this using a small Groovy-based utility and will post about it if you like (let me know).
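
To give you an idea, here is a minimal node.js version of such a call. It assumes bitcoind is running locally with rpcuser/rpcpassword set in bitcoin.conf (the credentials below are placeholders) and listening on the default RPC port 8332:

var http = require('http');

function rpc(method, params, cb) {
    var body = JSON.stringify({ id: 'techblog', method: method, params: params });
    var req = http.request({
        host: 'localhost',
        port: 8332,                  // default bitcoind RPC port
        method: 'POST',
        auth: 'rpcuser:rpcpassword', // placeholders, see your bitcoin.conf
        headers: { 'Content-Length': Buffer.byteLength(body) }
    }, function (res) {
        var data = '';
        res.on('data', function (chunk) { data += chunk; });
        res.on('end', function () { cb(JSON.parse(data).result); });
    });
    req.end(body);
}

// the same call as on the command line above
rpc('getreceivedbyaddress', ['1DFCV6SkfehpYUstsjneU8yr41Epe6j9iq', 1], function (amount) {
    console.log(amount); // e.g. 0.0001
});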

If you don’t want to run your own Bitcoin server, you have the choice of using a Bitcoin payment provider. Another smart choice might be something in between – e.g. blockchain.info offers nice callback options for your integration. Once funds are received, a callback informs your server about the received payment. You can then mark the order in question as paid and continue the order fulfilment process.
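
Such a callback receiver can be tiny. A sketch, assuming hypothetical query parameter names (check the provider’s documentation for the real ones) and an orderId that you passed along when creating the receiving address:

var http = require('http');
var url = require('url');

http.createServer(function (req, res) {
    var q = url.parse(req.url, true).query;

    // parameter names are assumptions for this sketch
    if (q.orderId && parseInt(q.confirmations, 10) >= 2) {
        markOrderPaid(q.orderId, q.value);
    }

    // respond so the service knows it can stop repeating the callback
    res.end('*ok*');
}).listen(8080);

function markOrderPaid(orderId, value) {
    // your own fulfilment hook, e.g. flag the order as paid in the shop
    console.log('order ' + orderId + ' paid: ' + value);
}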

We’ll have more news and discussion around Bitcoin soon – for now I have to catch my flight from Beijing back to Germany. Coincidentally, China currently has the most Bitcoin transactions globally and the largest marketplace. Baidu.com, the Chinese search giant, is now accepting Bitcoin, too!
