group_project_4

Brain-Computer Interfaces: Paint with your brain

Note: For a general introduction to BCIs, Emotiv and EEG, please check out the BCI Wiki Page.

Our project Braint is about combining generative art with EEG. This wiki page starts with an overview of what we did, what we used and where you can find all of it. After that we have collected some useful things we learned while working on the project. We have also uploaded the project's source files to GitHub, so you can take a closer look at what we did and how we did it.

Braint Project Overview

Creativity Session

We started the project with a creativity session where we decided what we wanted to use the EEG for. Unfortunately we did not take a picture of the session; however, we have tried to write a short summary of it.

Domains

BCIs have many application areas, all of which are described in the BCI Wiki Page.

The main areas we talked about during the session were:

* Brain Painting / Art

* Emotion Recognition / Affective Systems

* Neurofeedback

* Controlling systems with the brain (e.g. wheelchair, games)

Technology cards

Here is a list of most of the technology cards we used during the session:

* Brain painting

* Brain Bike Map

* Mind Controlled Virtual Reality:Rift + Hydra + EPOC

* Right path | Bicycle alert system that identifies when the cyclist intends to turn left/right and signals automatically

* Person of interest | Prevent misidentification using face recognition & EEG

* Hive mind | Collaborative decision making – all 5 team members and some judges moved a bee on the screen, together!

* Brainifeed | Subconscious content filtering, lets you automatically read your feed and stop on the stories your brain signals it’s interested in

* Shoni | Alert system upon distress, panic and falling asleep. This could be used for driving, elderly care, and many other uses.

* Date my brain | Matching individuals based on EEG pattern

* Activity tracker | Analytics to monitor your online activity

* INFinity | Neurofeedback with VR

* Brain-controlled wheelchair | For people who cannot move: you can direct the wheelchair with your thoughts (Emotiv)

* Nevermind: A biofeedback controlled online game

Characteristics

We chose to use EEG for creating art and identified some characteristics for this domain.

Brain Painting / Art:

  • EEG: brain waves, facial expressions
  • Emotions
  • Art/Creativity

Braint Project Process / Workflow

Here we briefly describe the development process; the tools, SDKs, etc. mentioned here are introduced in the next chapter.

Week 1

After the project kick-off presentation we started to gather ideas on how to paint and searched for libraries / tools we could use for that. We decided on generative art, so we had to choose how to produce it and ended up with Processing 3, which is based on Java and can also be used in an IDE like Eclipse.

Week 2 & 3

The next week we split up the work: one of us focused on how to get the headset data with the SDK. To use the functionality of the Emotiv Epoc+ headset we had to decide which of the provided SDKs we wanted to use and chose the C# (.NET) wrapper of the research SDK. The other focused on getting to know Processing and figuring out how to create some generative art.

To communicate between the applications we used OSC.

We then created the first prototype using the raw data of the headset in combination with an emotional state and agents (generative design).

Week 4 & 5

After the first prototype we were unsatisfied with the reliability of the headset's emotional states and facial expressions and hence decided to use OpenVibe to process the raw signal of the headset and determine alpha and beta powers (described in detail later). The application that gathered the data from the headset was completed but no longer used due to the headset's unreliability. However, as OpenVibe can also send data via OSC, it is possible to use OpenVibe and the C# application at the same time. We also tuned the drawing with the agents a bit.

The problem during those weeks was that all of us were very busy with other projects, leading to an even busier final buffer week.

Week 6

During the last week we tried to figure out the best way to map the brain data to the drawing parameters. We tested and experimented a lot and ended up creating a GUI so the settings for the drawing agents and more could be adjusted. This was very rushed and led to a rather rough GUI.

Main Tools

Emotiv Research SDK

Note: Due to the license we cannot provide a download link for the SDK, so please ask your supervisor how to get it. However, if you do not need the raw EEG data you can use the free SDK.

The research SDK is written in C/C++ but also provides several wrappers. Basically you can use the SDK in any language as long as you can import the .dll files. The wrappers provided are for:

  • C# (.Net 4)
  • Java
  • Python
  • (Matlab)

We tried to use the Java wrapper, which did not work due to problems importing the .dll files of the SDK with JNA. We found a possible solution (described in this thesis) but simply decided to use the C# wrapper and send the headset data via OSC.

There are examples for their usage provided with the SDK.

Processing 3 & Generative Design

„Processing is a flexible software sketchbook and a language for learning how to code within the context of the visual arts.“

As it is written on the Processing 3 website, Processing can be used for prototyping, creating art and other things. We used it because there are many examples of generative art available; in particular we used the book Generative Design. It provides a lot of approaches for generative art, and you can find the code and example videos on the book's website. Another useful website is OpenProcessing, where you can find many more art examples as well as code examples for Processing in general.

As mentioned above, Processing is based on Java and hence can be used in an IDE like Eclipse. You are able to use any Java code in your Processing application. We provide a short introduction to Java and Processing in the next chapter.

OpenVibe for Signal Processing

With the OpenVibe software it is easy to do signal processing in real time. The software supports the Emotiv device, so you can get the raw signals from all of the electrodes and additionally the data of the gyroscope. OpenVibe is a node designer: basically you just create a signal processing chain, connect the nodes and finally send the result to your application via OSC, TCP or even VRPN. You will need the .dll files from the research SDK for it to work.

To get started you can check out the documentation and tutorials on the OpenVibe website or read the introduction we give in the next chapter.

Braint: Things that might come in handy for you

Here we provide some code snippets and some general tips we picked up while working on the project.

Research SDK Exploration (C#)

For a basic introduction to the SDK please check out the BCI Wiki Page.

We have written a little console application which demonstrates how you can receive the raw data from the Emotiv device as well as emotional states and expressive states.

The C# solution consists of the following projects:

  • AffectiveStateLogger: Can be used if you want to log affective data from the headset or the EmoComposer to a .CSV file to later send it with the Braint application as a kind of simulation.
  • ExpressivStateLogger: Can be used if you want to log expressive data from the headset or the EmoComposer to a .CSV file to later send it with the Braint application as a kind of simulation.
  • EmoEngineExploration: Just some testing went on here.
  • Braint: The main application that can connect to the headset and send the data via OSC, or use recorded .CSV logs for simulation.
  • SharpOSC: The library used for OSC messages.

You can find the solution here: Braint-Emotiv on GitHub

Important: You have to put edk.dll and edk_util.dll in the folder where your compiled .exe files are, e.g. in the Debug/Release folder.

Connecting to EmoEngine and Adding Listeners for EmoState updates

Basically you just need to connect to the EmoEngine and add event listeners to the engine for the affective and expressive states. Again, please look at the BCI Wiki Page for a first introduction.

// get the emo engine instance
public EmoEngine engine = EmoEngine.Instance;

// add event listeners for the events you want to listen to
engine.EmoEngineConnected += new EmoEngine.EmoEngineConnectedEventHandler(engine_connected);
engine.UserAdded += new EmoEngine.UserAddedEventHandler(engine_UserAdded_Event);
engine.EmoEngineDisconnected += new EmoEngine.EmoEngineDisconnectedEventHandler(engine_disconnected);
// this event is fired every time the emotional states change
engine.AffectivEmoStateUpdated += new EmoEngine.AffectivEmoStateUpdatedEventHandler(affectiveStateUpdate);
// this event is fired every time the expressive states change
engine.ExpressivEmoStateUpdated += new EmoEngine.ExpressivEmoStateUpdatedEventHandler(epressiveStateUdpate);

// connect to the engine
engine.Connect();

In the following we will look at some examples of the listeners. But first, let's take a look at the raw signal.

Raw EEG Signal with Emotiv

For the raw signal you simply need to enable it in the code and get the data. The data comes as a dictionary with the channel names as keys and double arrays as values, which contain all samples gathered since the last request for data.

// this is the event listener called when a user is connected; you need it to enable data acquisition for a certain user
        void engine_UserAdded_Event(object sender, EmoEngineEventArgs e)
        {
            Console.WriteLine("User Added Event has occured");
 
            // record the user 
            userID = e.userId;
 
            // enable data acquisition for this user.
            engine.DataAcquisitionEnable(userID, true);
 
            // ask for up to 1 second of buffered data
            engine.EE_DataSetBufferSizeInSec(1);
 
        }
 
// this is the loop in which you gather the raw data from the headset
 
        void loopMe()
        {
            // get 1 second of data 
            engine.ProcessEvents(1000);
            // If the user has not yet connected, do not proceed
            // should not happen
            if (userID == 999)
                return;
 
 
 
            // here you get the data for a user
            Dictionary<EdkDll.EE_DataChannel_t, double[]> data = engine.GetData((uint)userID);
 
 
 
            if (data == null)
            {
                return;
            }
 
            // here you determine how many samples you have for each channel
            int _bufferSize = data[EdkDll.EE_DataChannel_t.TIMESTAMP].Length;
 
            // outer loop for rows
            for (int i = 0; i < _bufferSize; i++)
            {
                // inner loop over the channels (columns)
                foreach (EdkDll.EE_DataChannel_t channel in data.Keys)
                {
                    // do something with data[channel][i]
                }
            }
 
        }

Affective State Listener

Here you find an example of how to access the affective states. It is taken from the AffectiveStateLogger project.

// add your own listener to the engine
engine.AffectivEmoStateUpdated += new EmoEngine.AffectivEmoStateUpdatedEventHandler(affectiveStateUpdate);
// implement your own listener
static void affectiveStateUpdate(object sender, EmoStateUpdatedEventArgs e)
        {
			// get the emo state
            EmoState es = e.emoState;
            float lastUpdate = time;
            float esTimeStamp = es.GetTimeFromStart();
            string systemTimeStamp = DateTime.Now.ToString("hh.mm.ss.ffffff");
            // Write the data to a file
            TextWriter file = new StreamWriter(filename, true);
 
            // "Timestamp,EmoState_Timestamp,BoredomScore,ExcitementShortScore,FrustrationScore," +
            // " MediationScore,ValenceScore,ExcitementLongShort,"
 
 
            if (e.userId == userID)
            {
                file.Write(systemTimeStamp + ";");
                file.Write(Convert.ToString(esTimeStamp) + ";");
                file.Write(es.AffectivGetEngagementBoredomScore() + ";");
                file.Write(es.AffectivGetExcitementShortTermScore() + ";");
                file.Write(es.AffectivGetFrustrationScore() + ";");
                file.Write(es.AffectivGetMeditationScore() + ";");
                file.Write(es.AffectivGetValenceScore()+";");
                file.Write(es.AffectivGetExcitementLongTermScore() + ";");
                file.WriteLine("");
 
 
 
                Console.WriteLine("Receiveing affective update .....");
 
            }
            file.Close();
 
        }

Expressive State Listener

Here you find an example of how to access the expressive data. It is taken from the ExpressiveStateLogger project.

// add your listener to the engine
engine.ExpressivEmoStateUpdated += new EmoEngine.ExpressivEmoStateUpdatedEventHandler(epressiveStateUdpate);
// implement your listener
private static void epressiveStateUdpate(object sender, EmoStateUpdatedEventArgs e)
        {
            // get the emo state
            EmoState es = e.emoState;
            if (e.userId == userID)
            {
                string header = "EmoState_Timestamp;" +
                    "LowerFaceAction;LowerFaceActionPower;UpperFaceAction;UpperFaceActionPower;" +
                    " ExpressivEyelidStateX;ExpressivEyelidStateY;ExpressivEyeLocationX;ExpressivEyeLocationY" +
                    "IsBlink;AreEyesOpen;IsLeftWink;IsRightWink;IsLookingLeft;IsLookingRight;IsLookingDown;IsLookingUp";
                // Wr;ite the data to a file
                TextWriter file = new StreamWriter(filename, true);
 
                file.Write(es.GetTimeFromStart());
                file.Write(";");
 
                EdkDll.EE_ExpressivAlgo_t lowerFaceAction = es.ExpressivGetLowerFaceAction();
                float lowerFaceActionPower = es.ExpressivGetLowerFaceActionPower();
 
                EdkDll.EE_ExpressivAlgo_t upperFaceAction = es.ExpressivGetUpperFaceAction();
                float upperFaceActionPower = es.ExpressivGetUpperFaceActionPower();
 
                file.Write(lowerFaceAction);
                file.Write(";");
                file.Write(lowerFaceActionPower);
                file.Write(";");
 
                file.Write(upperFaceAction);
                file.Write(";");
                file.Write(upperFaceActionPower);
                file.Write(";");
 
                // EYES
                float x, y;
                es.ExpressivGetEyelidState(out x, out y);
 
                file.Write(x);
                file.Write(";");
                file.Write(y);
                file.Write(";");
 
 
 
                float posX, posY;
                es.ExpressivGetEyeLocation(out posX, out posY);
 
                file.Write(posX);
                file.Write(";");
                file.Write(posY);
                file.Write(";");
 
                bool isBlink = es.ExpressivIsBlink();
                file.Write(isBlink);
                file.Write(";");
 
                bool areEyesOpen = es.ExpressivIsEyesOpen();
                file.Write(areEyesOpen);
                file.Write(";");
 
                bool isLeftWink = es.ExpressivIsLeftWink();
                bool isRightWink = es.ExpressivIsRightWink();
                file.Write(isLeftWink);
                file.Write(";");
                file.Write(isRightWink);
                file.Write(";");
 
 
 
                bool isLookingLeft = es.ExpressivIsLookingLeft();
                bool isLookingRight = es.ExpressivIsLookingRight();
                bool isLookingDown = es.ExpressivIsLookingDown();
                bool isLookingUp = es.ExpressivIsLookingUp();
                file.Write(isLookingLeft);
                file.Write(";");
                file.Write(isLookingRight);
                file.Write(";");
                file.Write(isLookingDown);
                file.Write(";");
                file.Write(isLookingUp);
                file.Write(";");
 
                file.WriteLine("");
                file.Close();
 
            }
 
 
 
        }

Train Cognitive Actions & Facial Expressions

It is also easy to train cognitive actions and facial expressions. The process is described very well in the manual with a sequence diagram.

So basically you have to start the training in your application.

            engine.ExpressivSetTrainingAction(p.userID, EdkDll.EE_ExpressivAlgo_t.EXP_SMILE);
 
            engine.ExpressivSetTrainingControl(p.userID, EdkDll.EE_ExpressivTrainingControl_t.EXP_START);

After the start there is a two second delay until the recording of training data starts.

The ExpressivTrainingStarted event will be fired after those two seconds and the training will last 8 seconds. After that either the failed or succeeded event will be fired. So, you also have to listen to some of the events occurring during training.

            engine.ExpressivTrainingStarted += new EmoEngine.ExpressivTrainingStartedEventEventHandler(p.engine_expressiveTrainStarted);
            engine.ExpressivTrainingFailed += new EmoEngine.ExpressivTrainingFailedEventHandler(p.engine_expressiveTrainFailed);
            engine.ExpressivTrainingSucceeded += new EmoEngine.ExpressivTrainingSucceededEventHandler(p.engine_expressiveSucceded);
            engine.ExpressivTrainingCompleted += new EmoEngine.ExpressivTrainingCompletedEventHandler(p.engine_expressiveTrainCompleted);

The last event ExpressivTrainingCompleted is fired after you either rejected or accepted the training session in your application as seen below.

            engine.ExpressivSetTrainingControl(p.userID, EdkDll.EE_ExpressivTrainingControl_t.EXP_REJECT);
            engine.ExpressivSetTrainingControl(p.userID, EdkDll.EE_ExpressivTrainingControl_t.EXP_ACCEPT);

Other SDK features

We did not try out everything, as we no longer used the SDK for our final prototype. However, there are more possibilities: e.g. you could write a cognitive state listener in the same way as the affective and expressive ones described above. It is also possible to save user profiles and even upload them to your Emotiv account. This can be useful if you are training actions and want to save the user profiles together with the trained data.

So for more possibilities and examples check out the user manual provided with the SDK.

SharpOSC: sending OSC messages

We used SharpOSC for sending the data from the headset to the Java application.

Here is an example of how we send the raw data. Note that when we gather data from the headset we get a batch of samples, not a single one. Hence, to replay the data in real time as if it came directly from the headset, we always compute the time difference between the current and the next sample and wait that long before sending the next sample.

        UDPSender sender;
        public RawDataOSC(String address, int port)
        {
            sender = new SharpOSC.UDPSender("127.0.0.1", 12000);
        }
 
        public void sendRawDataNormalized(Dictionary<EdkDll.EE_DataChannel_t, double[]> data, string sendToOSCAdress)
        {
 
            int _bufferSize = data[EdkDll.EE_DataChannel_t.TIMESTAMP].Length;
 
            for (int i = 0; i < _bufferSize; i++)
            {
 
                // sending line by line
                OscMessage msg = new OscMessage(sendToOSCAdress, "");
                msg.Arguments.Clear();
                // now write the data
                foreach (EdkDll.EE_DataChannel_t channel in data.Keys)
                {
 
 
                    if (!channel.Equals(EdkDll.EE_DataChannel_t.COUNTER)
                        && !channel.Equals(EdkDll.EE_DataChannel_t.ES_TIMESTAMP)
                        && !channel.Equals(EdkDll.EE_DataChannel_t.FUNC_ID)
                         && !channel.Equals(EdkDll.EE_DataChannel_t.FUNC_VALUE)
                         && !channel.Equals(EdkDll.EE_DataChannel_t.GYROX)
                          && !channel.Equals(EdkDll.EE_DataChannel_t.GYROY)
                           && !channel.Equals(EdkDll.EE_DataChannel_t.INTERPOLATED)
                            && !channel.Equals(EdkDll.EE_DataChannel_t.MARKER)
                             && !channel.Equals(EdkDll.EE_DataChannel_t.RAW_CQ)
                              && !channel.Equals(EdkDll.EE_DataChannel_t.SYNC_SIGNAL)
                               && !channel.Equals(EdkDll.EE_DataChannel_t.TIMESTAMP))
                    {
                        double currentValue = data[channel][i];
                        //if (minSmaple > currentValue)
                        //    minSmaple = currentValue;
                        //else if (maxSamlpe < currentValue)
                        //    maxSamlpe = currentValue;
 
                        double normalized = getNormalizedValue(currentValue, maxSamlpe, minSmaple, maxScale, minSCale);
                        msg.Arguments.Add(normalized);
                    }
                    else {
 
                        msg.Arguments.Add(data[channel][i]);
                    }
                }
 
                // try to send the messages with appropriate time intervals
                // to simulate real-time input as closely as possible
 
                sender.Send(msg);
 
                if (i < _bufferSize - 1)
                {
                    double waitTimeInSeconds = data[EdkDll.EE_DataChannel_t.TIMESTAMP][i + 1] - data[EdkDll.EE_DataChannel_t.TIMESTAMP][i];
 
 
                    double timeInMS = waitTimeInSeconds * 1000.0;
 
                    int sleepTime = Convert.ToInt32(timeInMS);
 
                    Thread.Sleep(sleepTime);
 
 
                }
 
            }
 
        }
 
        static public double getNormalizedValue(double value, double max, double min, double maxScaled, double minScaled)
        {
            return minScaled + (value - min) * (maxScaled - minScaled) / (max - min);
        }

Processing 3 / Java

Our Java application using the Processing libraries can be found here: Braint-Processing on GitHub

In the following we will show you some code examples of our project.

Using Processing 3 with Java

We will explain the basics of how to use Processing in Java using the code example of our agent method.

You can also find a tutorial on the Processing website to use it with Eclipse.

Basically you have to import the Processing libraries into your Java project. The libraries are located in „core\library“ of your Processing installation. To get started you just need core.jar; however, if you want to use the OpenGL renderers of Processing you have to add the other .jar files in that folder as well.

If you have another library made specifically for Processing, you also just have to add its .jar to your Java project.

Step 1: Set up the PApplet

You have to create a class which extends PApplet; you can always use only one PApplet. In our first version of the agents the class looked like this:

public class BraintAgentDraw extends PApplet {
 
    Agent[] agents = new Agent[10000];
    float strokeWidthScale;
    float noiseScale, noiseStrength;
    int rgb;
 
 
    public static void main(String args[]) {
        PApplet.main(new String[] { "--present", "Braint.BraintAgentDraw" });
    }
 
 
    public void settings() {
        size(1920, 1080);
    }
 
    public void setup() {
 
        for(int i = 0; i<agents.length; i++) {
            agents[i] = new Agent(this);
        }
        strokeWidthScale = 0.3f;
        noiseStrength = 10f;
        noiseScale = 300f;
        rgb = 0xff2b2b2b;
    }
 
    public void draw(){
 
 
//        if (drawMode == 1) {
            for(int i = 0; i < agents.length; i++) {
                agents[i].update1(this);
                /*
                if (i < 2500)  noiseScale = 1;
                else if (i >= 2500 && i < 5000) noiseScale = 0.666f;
                else if (i >= 5000 && i < 7500) noiseScale = 0.333f;
                else noiseScale = 4;
                */
            }
    }
}

You always need the main-method.

In the settings-method you can set the screen resolution; you could also use fullScreen() instead. In the setup-method you set up everything you want to initialize before the first drawing; in our case we initialize the agents and set their parameters. The draw-method is a loop which is repeated endlessly until you stop it; here we just update every agent.
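
For reference, here is a minimal, self-contained PApplet with the three lifecycle methods mentioned above. It is a generic sketch of our own (the class name MinimalSketch and the drawing code are placeholders), not code from the Braint repository:

import processing.core.PApplet;

public class MinimalSketch extends PApplet {

    public static void main(String[] args) {
        PApplet.main("MinimalSketch");
    }

    @Override
    public void settings() {
        size(640, 480);      // or fullScreen();
    }

    @Override
    public void setup() {
        background(255);     // runs once before the first frame
        stroke(0);
    }

    @Override
    public void draw() {
        // runs in a loop until the sketch is stopped
        line(random(width), random(height), random(width), random(height));
    }
}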

Step 2: Add another class to be used while drawing. In our example we add the class Agent:

public class Agent {
    public PVector p, pOld;
    public float stepSize, angle;
    public boolean isOutside = false;
 
    public Agent(BraintAgentDraw bad) {
 
        p = new PVector(bad.random(bad.width), bad.random(bad.height));
        pOld = new PVector(p.x, p.y);
        stepSize = bad.random(1, 5);
        //stepSize = 0.1f;
    }
 
    public void update1(BraintAgentDraw bad) {
 
        //alternative
        angle = bad.noise(p.x / bad.getNoiseScale(), p.y / bad.getNoiseScale()) * bad.getNoiseStrength();
 
        //angle = bad.noise(p.x, p.y);
        p.x += bad.cos(angle) * stepSize;
        p.y += bad.sin(angle) * stepSize;
 
        //System.out.println("(" + pOld.x + "," + pOld.y + ")" + " -> " + "(" + p.x + "," + p.y + ")" + "  Angle = " + angle);
        //System.out.println(bad.cos(angle));
        //System.out.println(bad.noise(p.x / bad.getNoiseScale(), p.y / bad.getNoiseScale()));
 
 
        if(p.x < -10) isOutside = true;
        else if (p.x > bad.width+10) isOutside = true;
        else if (p.y < -10) isOutside = true;
        else if (p.y > bad.height+10) isOutside = true;
 
 
        if(isOutside) {
 
            p.x = bad.random(bad.width);
            p.y = bad.random(bad.height);
            pOld.set(p);
 
        }
 
        float strokeWidth = 1;
        bad.strokeWeight(strokeWidth*bad.getStrokeWidthScale());
        bad.stroke(bad.getRGB());
        bad.line(pOld.x, pOld.y, p.x, p.y);
        pOld.set(p);
        isOutside = false;
 
    }
}

Here we just set up a normal Java class. The only important thing is that you have to give every instance of the class a reference to the PApplet. Then you can call every Processing method through that PApplet.

In our example, we give every agent coordinates. The update function calculates the angle in which the agent should move according to a noise function, depending on the variables noiseScale and noiseStrength. Then we calculate the new coordinates for the agent. If the agent is outside the screen it is placed somewhere new. After the calculation of the new coordinates, a line is drawn from the old coordinates to the new ones.

oscP5: Processing OSC Library

We used oscP5 for receiving the messages sent from OpenVibe or the C# application.

Here is a basic example of the usage of the library. We added the listener (OscP5) to our main PApplet and delegated the messages either to the OpenVibeHandler or EmoEngineHandler.

// this = a PApplet
OscP5 oscP5 = new OscP5(this, BigSettings.instance().OSC_PORT);
 
// you just have to implement this method if you added an OscP5 to your PApplet
public void oscEvent(OscMessage theOscMessage) {
 
		String msgAddr = theOscMessage.addrPattern();
 
		if (msgAddr.contains(BraintUtil.OSC_OPENVIBE_ALPHA) || msgAddr.contains(BraintUtil.OSC_OPENVIBE_BETA)) {
 
			openVibeAlphaBetaPower.handleOSCMessage(theOscMessage);
 
		} else if (msgAddr.contains(BraintUtil.OSC_EMO_ENGINE)) {
 
			if (emoEngine == null)
				return;
 
			try {
 
				emoEngine.handleOSCMessage(theOscMessage);
			} catch (Exception e) {
 
				e.printStackTrace();
				theOscMessage.print();
				System.out.println(emoEngine == null);
 
			}
 
		}
}

Here is an example of how we receive the data sent from the C# application. As you can see here the good thing about OSC is that you can easily convert the bytes to the correct data types.

@Override
	public void handleOSCMessage(OscMessage msg) {	
		if (msg.checkAddrPattern("/emoengine/affective")) {
 
			/*
			 * typetag sfdddddd 
			 * [0] emotivTimeStamp, boredom, excitement,
			 * frustration, mediation, valence, excitementLongTerm 
			 * [1] 40.09696
			 * [2] 0.5487005114555359 
			 * [3] 0.0 [4] 0.7110533118247986
			 * [5] 0.3333112597465515 
			 * [6] 0.625 
			 * [7] 0.0
			 */
 
			// the osc message has a list of values sent
			// above you can see the typetag "sfdddddd" which you can use 
			// to make sure you're doing the right conversion
 
			affState.timeStamp = msg.get(1).floatValue();
			affState.boredom = (float) msg.get(2).doubleValue();
			affState.excitement = (float) msg.get(3).doubleValue();
			affState.frustration = (float) msg.get(4).doubleValue();
			affState.mediation = (float) msg.get(5).doubleValue();
			affState.valence = (float) msg.get(6).doubleValue();
			affState.longtermExcitement = (float) msg.get(7).doubleValue();
 
		}
	}

controlP5: Processing GUI library

controlP5 is a library for creating a GUI. It supports annotations for creating GUI elements, and if you name a function the same way as the variable you annotate, you already have a listener for it. The example below shows the slider for changing the calibration time.

	@ControlElement(x = 0, y = 0, label = "calibHeading", properties = { "type=textlabel" })
	String CalibrationHeading = "Calibration with OpenVibe";
 
	@ControlElement(properties = { "min=5",
			"max=60" }, x = 0, y = MainGui.default_GUI_Element_Height, label = "Calibration Time (seconds)")
	public int calibrationTime = BigSettings.instance().CALIBRATION_TIME;
 
	public void calibrationTime(int value) {
 
		BigSettings.instance().CALIBRATION_TIME = value ;
		calibrationTime = value;
	}

If you download the library, it comes with many good examples that show how to use it.

Apache Commons Math: Java Statistics Library

For computing descriptive statistics of the alpha and beta power we used the commons math library.

Here is a short example on how to use it:

// create a summary object where all data samples are stored in
SummaryStatistics alphaStatisticsSummary = new SummaryStatistics();
// add values to it during calibration or similar
alphaStatisticsSummary.addValue(currentAlpha);
 
// when you have added all the values you can simply get the mean, variance etc. from the summary object
alphaStatisticsSummary.getMean();
alphaStatisticsSummary.getStandardDeviation();

Simple XML Serialization

With Simple XML you can serialize classes using annotations.

Here is a basic example for the annotations:

@Root(name = "root")
public class BigSettings {
 
	@Element(name = "colorMode")
	public int colorMode = 1;
 
}

For serialization and deserialization we also have an example:

public class XMLUtil {
 
 
 
	public static <T> void serializeObject(T objectToSerialize, String fileName) {
 
		Serializer serializer = new Persister();
 
		String filename = fileName;
 
		if (!filename.contains(".xml"))
			filename += ".xml";
 
		File result = new File("./settings/" + filename);
 
		if (!result.exists()) {
			try {
				result.createNewFile();
			} catch (IOException e1) {
				// TODO Auto-generated catch block
				e1.printStackTrace();
			}
		}
 
		try {
			serializer.write(objectToSerialize, result);
		} catch (Exception e) {
			// TODO Auto-generated catch block
			e.printStackTrace();
		}
	}
 
	public static <T> T deserialzeObject(T type, String fileName){
 
		Serializer serializer = new Persister();
 
		String filename = fileName;
 
		if (!filename.contains(".xml"))
			filename += ".xml";
 
		File source = new File("./settings/" + filename);
 
		T obj = null;
 
		try {
			obj = serializer.read(type, source);
		} catch (Exception e) {
			// TODO Auto-generated catch block
			e.printStackTrace();
		}
 
		return obj;
 
	}
 
}

OpenVibe Scenario

You can find the scenarios we used here. One was used for testing; the other is used when connected to the headset.

Introduction to OpenVibe

The OpenVibe software basically consists of two parts: the acquisition server and the designer. The first is used to connect to your EEG device - in our case the Emotiv headset - and the second is used to create your signal processing chain.

In the pictures you can see the acquisition server, where you can select the device and set its properties. You then click Connect to connect to the device and afterwards click Play to start sending the device data to the designer.

Below you can see a basic example created with the designer. Here the acquisition client is used to receive the data from the acquisition server and it is displayed via the signal display box.

Note: some of the boxes / nodes available in the designer are only visible if you check the 'unstable' checkbox.

Setting up Emotiv & the OpenVibe Acquisition Server

When you want to connect the Emotiv to the acquisition server, it will tell you to set the path to edk.dll in the driver properties; however, the documentation seems to be rather poor and we only got it working by copying edk.dll & edk_utils.dll to the binary folder of OpenVibe (.\openvibe\bin).

Once the dll files are in the right place it should work just fine and you can receive the raw data in the OpenVibe designer.

Example of extracting Alpha & Beta Power from the raw signal

Don't know what alpha and beta waves are and to what they are related? Then check it out here: BCI Wiki Page.

In the picture below you can see the basic signal processing chain.

What you basically have to do is select the channels you want to use for the power computation and then filter the frequencies with the temporal filter box (e.g. alpha 7-13 Hz and beta 16-24 Hz). Now you have the signal filtered by frequency band; however, as the power describes how much activity there is in a frequency band, you need to accumulate some data over time. This is done with the time based epoching box: it gathers data for 1 second and starts a new epoch every 0.1 seconds (both values can be adjusted). To avoid that positive and negative values cancel each other out and eventually sum to 0, the result of the epoch data accumulation is squared using the simple DSP box. Those values are then averaged, and averaged once more in the epoch average box, where the last 4 epochs are always used to compute the average. And that's it! You have the power of a frequency band in real time.

Keep in mind that this chain computes the power for each of the channels selected at the beginning; it is up to you whether you use all power values and e.g. average them to get one value, or use the power of each channel separately.
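
If you prefer reading code over boxes, here is a small Java sketch of the same idea. It is not part of our repository and the class and variable names (BandPowerSketch, filteredSamples, ...) are made up for illustration; it only mirrors the squaring and averaging steps described above:

import java.util.ArrayDeque;
import java.util.Deque;

public class BandPowerSketch {

    private static final int EPOCHS_TO_AVERAGE = 4; // like the "epoch average" box

    private final Deque<Double> lastEpochPowers = new ArrayDeque<>();

    // average of the squared samples of one epoch (one channel, already band-pass filtered)
    public double epochPower(double[] filteredSamples) {
        double sumOfSquares = 0;
        for (double s : filteredSamples) {
            sumOfSquares += s * s; // squaring avoids positive and negative values cancelling out
        }
        return sumOfSquares / filteredSamples.length;
    }

    // moving average over the last EPOCHS_TO_AVERAGE epoch powers
    public double smoothedPower(double[] filteredSamples) {
        lastEpochPowers.addLast(epochPower(filteredSamples));
        if (lastEpochPowers.size() > EPOCHS_TO_AVERAGE) {
            lastEpochPowers.removeFirst();
        }
        double sum = 0;
        for (double p : lastEpochPowers) {
            sum += p;
        }
        return sum / lastEpochPowers.size();
    }

    // optionally combine several channels into one value by averaging their powers
    public static double averageOverChannels(double[] channelPowers) {
        double sum = 0;
        for (double p : channelPowers) {
            sum += p;
        }
        return sum / channelPowers.length;
    }
}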

Finally, you should send the value you computed to your application for which we used the OSC controller box. You can only send the value of one single channel via OSC. If you want to send all of the values at once you will need to use the TCP socket. Unfortunately you will have to deal with the byte stream in your application then.

OpenVibe TCP Socket

If you want to use the TCP box, here is some C# code that reads the header and then stores the signal values in an array. You can find a description of what the signal byte stream looks like in the documentation of the TCP box. The code is not tested much as we decided to use OSC, but reading the header works correctly.

Note: Also check out the box documentation of the TCP Writer, as the byte stream is described there.

    // reading the TCP byte stream example
 
    public bool testHeader = true;
    public bool testSignal = false;
    public bool isString = false;
    public int testSampleChannelSize;
    public int testSampleCount;
    public int testChannelCount;	
    public double[,] lastMatrix;
    public float power;
    public RawOpenVibeSignal lastSignal;
 
    public class RawOpenVibeSignal {
 
        public int channels;
        public int samples;
 
        public double[,] signalMatrix;
    }
 
 
	private void readHeader()
    {
        // size of header is 8 * size of uint = 32 bytes
 
        int variableSize = sizeof(UInt32);
        int variableCount = 8;
 
        int headerSize = variableCount * variableSize;
 
        byte[] buffer = new byte[headerSize];
 
        theStream.Read(buffer, 0, headerSize);
 
        // version number (in network byte order)
        // endianness of the stream (in network byte order, 0==unknown, 1==little, 2==big, 3==pdp)
        // sampling frequency of the signal, 
        //  number of channels, 
        // number of samples per chunk and 
        // three variables of padding
 
 
        UInt32 version, endiannes, frequency, channels, samples;
 
        byte[] v = new byte[4] { buffer[0], buffer[1], buffer[2], buffer[3] };
        byte[] e = new byte[4] { buffer[4], buffer[5], buffer[6], buffer[7] };
        byte[] f = new byte[4] { buffer[8], buffer[9], buffer[10], buffer[11] };
        byte[] c = new byte[4] { buffer[12], buffer[13], buffer[14], buffer[15] };
        byte[] s = new byte[4] { buffer[16], buffer[17], buffer[18], buffer[19] };
        if (BitConverter.IsLittleEndian)
        {
            Array.Reverse(e);
            Array.Reverse(v);
            version = BitConverter.ToUInt32(v, 0);
            endiannes = BitConverter.ToUInt32(e, 0);
            frequency = BitConverter.ToUInt32(f, 0);
            channels = BitConverter.ToUInt32(c, 0);
            samples = BitConverter.ToUInt32(s, 0);
        }
        else
        {
 
            version = 999;
            endiannes = 0;
            frequency = 0;
            channels = 0;
            samples = 0;
        }
 
        testHeader = false;
        testSampleCount = buffer[16];
        testChannelCount = buffer[12];
        testSampleChannelSize = buffer[12] * buffer[16] * sizeof(double);
        testSignal = true;
    }
 
    public double readSocket()
    {
        if (!socketReady)
            return 0; // TODO
        if (theStream.DataAvailable)
        {
            // read header once
            if (testHeader)
            {
                readHeader();
 
            }
 
            if (testSignal)
                {
                // raw signal data
                // [nSamples x nChannels]
                // all channels for one sample are sent in a sequence, then all channels of the next sample
 
                // create a signal object to send it to another
                RawOpenVibeSignal newSignal = new RawOpenVibeSignal();
                newSignal.samples = testSampleCount;
                newSignal.channels = testChannelCount;
 
                double[,] newMatrix = new double[testSampleCount,testChannelCount];
 
 
                byte[] buffer = new byte[testSampleChannelSize];
 
                theStream.Read(buffer, 0, testSampleChannelSize);
 
                int row = 0;
                int col = 0;
                    for (int i = 0; i < testSampleCount * testChannelCount * (sizeof(double)); i = i + (sizeof(double) * testChannelCount))
                    { 
                        for (int j = 0; j < testChannelCount * sizeof(double); j = j + sizeof(double))
                        {
 
                            byte[] temp = new byte[8];
 
                            for(int k = 0; k < 8; k++)
                            {
                                temp[k] = buffer[i + j + k];
                            }
 
                        if (BitConverter.IsLittleEndian)
                        {
                           // Array.Reverse(temp);
                            double test = BitConverter.ToDouble(temp, 0);
 
                            // TODO TEST THIS
                            //newMatrix[i / (8 * testChannelCount), j / 8] = test;
                            newMatrix[row, col] = test;
                        }
                        col++;
 
                        }
                    row++;
                    col = 0;
                    }
 
                newSignal.signalMatrix = newMatrix;
                lastSignal = newSignal;
                lastMatrix = newMatrix;
 
 
                return newMatrix[0, 0];
            }
                else if (isString) {
                    // TODO
                }
 
            }
            return 0;
 
    }

Emotiv Epoc+ Headset Tips & Problems

Problematic Electrodes

With the headset we used, we especially had problems with electrodes T7 and T8; without them working, the emotional state detection and the headset's reliability in general are quite poor, which led to us only using the raw data from the other electrodes together with OpenVibe.

In the user manual they give you the following advice:

One or both of the sensors immediately adjacent to the ears remains black.
• These sensors are located on the main body of the Arm assembly, closest to the arm pivot
point. They detect activity in the temporal lobes and are known as T7 (left side) and T8
(right side). A combination of the shape of the arm assembly and the user’s head shape
(particularly long, narrow heads with relatively flat sides) can sometimes result in these
sensors failing to touch the head, being held off by some of the other sensors.
• Check that the sensors are clean and attached properly as per the general comments in
the next section
• Remove the RUBBER COMFORT PAD including the plastic holder, from the side or sides
where the contact cannot be achieved. The neuroheadset can be worn comfortably without
these pads for people with this head shape, and no harm will come to the connector
sockets because they are fully enclosed. The change in balance point is usually sufficient
to ensure contact occurs.
• In the unlikely event that contact is still impossible to obtain, you can use a longer felt pad
or use a cotton ball soaked in saline to fill the gap or replace the felt piece.

Cleaning & Saline Solution

Saline Solution

At some point we had to replace the saline solution as it was running out. The picture on the right shows the contents of the shipped solution. You can basically use a contact lens cleaning solution that does not need to be washed out with water. You can find a description in the manual, but we also found this in the forums:

The saline requirements are described in general in the Hardware Setup Guide, however here are some more details: 

You can use normal saline rinsing solution, or more general purpose rewetting solutions. 
Don't use the pure disinfecting solutions which need to be washed out of the lens before 
they can go back in the eye, because these may also cause skin irritation. 
Isotonic saline will work fine. We need a salt content approximately 0.75% to 4% w/w in purified water. 
Typical lens solutions are around 0.9% saline. 
Look for a solution which also contains some antibacterial material and preservatives, 
because these will minimize cross-contamination risk and microbial load in the felt pads 
if they stay wet or are shared between users. 
Typical preservatives include EDTA, boric acid and a range of commercial preservatives 
which are approved for use in the eye as this makes the chance of skin irritation extremely low. 

If you use a pure saline which is sold at the pharmacy to dilute injections (for example) 
it would be a good idea to add a small quantity (about 1 part in 100) of first-aid disinfectant 
such as 70% isopropyl alcohol. 
This is not quite as good as commercial lens solution in term of bug control 
but works perfectly for electrical conduction.

Cleaning

During usage the electrodes oxidize and you will see „green stuff“ on them. It is important to remove at least part of it so the signal does not get noisy.

In the picture above you can see what the electrodes look like, with some marked spots. The green circles mark the spots you should keep clean (the dome) using cotton buds and alcohol. The blue part also often gets very dirty, but as far as we understood it, it is not necessary to clean it (we are also not completely sure whether you should use alcohol and cotton buds on it). The felt (red rectangle) should be taken out before cleaning, but this is explained in the cleaning guides we link to. Attention: never ever use anything sharp to remove the oxidized salt.

Here you can see how it looks when you take out the felt. Attention: never ever try to remove the oxidized residue there with a cotton bud or anything similar, as there is a special paste on the electrode which ensures a low-noise signal.

There is a detailed description of the cleaning procedure at the Emotiv Zendesk.

Here is a small part of the description:

Users will need to maintain the contacts to keep them operating like new. 
You will need the following items from your local pharmacy or drug store:

* Quantity of 0.4% to 0.9% saline.

* Small quantity (50 mil) of pure isopropyl alcohol.

* Six good quality cotton (ear) buds.

Following the instructions provided by EMOTIV, remove all of the contacts from the headset arms (or fingers).
Remove the felt pads and soak them in a flat dish with about 2 ml of saline on the bottom. 
While the felt pads are soaking, wet a cotton bud with the alcohol 
and rub only the rear of the gold contact in the felt pad holder.

In the forum there are two topics dealing with cleaning and what is needed for it: Topic 1, Topic 2

Here are the questions of a user that were answered by an Emotiv admin:

User's Questions:

1) Is it dangerous to remove the felt pads ? When I try to remove them, I feel a resistance, 
as if they were stuck/glued in the plastic holder. 
If I pull too strongly, is there a risk to damage the gold layer below ? 

2) Is it necessary to put the felt pads in salty water ? 
I guess it would be more effective to dissociate the salts in pure water first and in salty water at the end. 

3) If I got it right, only the dome part of the contact plate should be cleaned ? 
The one in contact with the connector of the arm of the headset. 
Is there a good way to clean this part ? 


Admin's Answer:

1. The felt pads are held in place by sharp fingers inside the sensor body. 
It does not damage the sensor plate or body to pull the felt pads out. 
There is minimal damage to the pads, they seem to survive quite ok. 
2. You could certainly clean out the residual salt with clean fresh water. 
It makes not a lot of difference, low concentration saline also works fine, 
there is not usually enough salt stored in the felt pad to affect the overall saline concentration in a soaking jar. 
3. Correct. Don't touch the opposite face. 
You can soak the plates in fresh water to soften the solids, then return them to the holder, 
or hold them in some other way to avoid touching the front, and use a soft toothbrush 
or a toothpick to remove the solid build-up

Braint: The final prototype

Agents

Agents are Java objects that consist only of coordinates. Normally, when you update an agent it changes its coordinates, and you then do something depending on the old and new coordinates. In our case we draw a line from the old coordinates to the new ones and change the movement of the agents according to some parameters. A more detailed description can be found under Using Processing 3 with Java in the documentation above.

The parameters you can use to change the way the agents move and look are the following:

NoiseScale:

The higher this value, the smoother/straighter the movement of the agent will look.

NoiseStrength:

The higher this value, the more complex the movement will look.

StrokeWidthScale:

This value modifies the standard strokeWidth parameter. The higher it is, the wider the stroke will be drawn.

StrokeStepSize:

This value can be used to modify the range in which the new coordinates for the agents are calculated. A higher value means the new coordinates are placed further away from the old ones. We don't use this parameter in our Braint app.

RGBValue:

This is a hex value which defines the colour in which the newest line should be drawn. You could change this value depending on something over time, or just give different agents different colours.

Some examples of different scale and strength values

Braint App

Our Processing application consists of a simple GUI where you can adjust several parameters. We will briefly explain its usage in the following.

It uses the alpha and beta power computed with the OpenVibe software, so before use make sure OpenVibe is running and connected to the headset.

There are several parts:

  • Electrode Selection
  • Calibration
  • Descriptive Statistics of Calibration
  • Thresholds for power values
  • Thresholds for agent parameters
  • Draw Methods
  • Drawing parameters unrelated to the power values
  • Mapping of Alpha / Beta Power Values to the Agent Parameters

Electrode Selection

Here you can select the electrodes that should be relevant for the alpha/beta power computation during drawing. Make sure to restart the calibration process after you change the electrodes, as the statistics are computed for the selected electrodes only.

Below we describe our two drawing methods; for the second one, every electrode is used regardless of what you have selected.

If you want to use alpha and beta power at the same time, it makes sense to use the same electrodes for both powers.

Calibration & Statistics

Before you start drawing you should calibrate, which basically means computing the alpha/beta power statistics for the selected electrodes while the user is in a neutral state.

Based on the statistics you should adjust the thresholds for the power values. For instance, we usually tried to use mean + 3 * variance as the upper threshold; however, if the variance is very high this does not make sense.
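
As an illustration (not code from our repository), this is how such thresholds could be derived from the calibration statistics with the Apache Commons Math SummaryStatistics object introduced earlier; the rule of thumb, the example values and the names are only assumptions:

import org.apache.commons.math3.stat.descriptive.SummaryStatistics;

public class CalibrationThresholdSketch {

    public static void main(String[] args) {
        // during calibration every incoming alpha power value is added to the summary
        SummaryStatistics alphaStats = new SummaryStatistics();
        double[] calibrationValues = { 3.1, 2.8, 3.4, 2.9, 3.0, 3.3 }; // example data only
        for (double v : calibrationValues) {
            alphaStats.addValue(v);
        }

        // rule of thumb mentioned above: mean + 3 * variance as the upper threshold
        double upperThreshold = alphaStats.getMean() + 3 * alphaStats.getVariance();
        // a lower threshold could be derived in a similar way, e.g. from the minimum
        double lowerThreshold = alphaStats.getMin();

        System.out.println("upper = " + upperThreshold + ", lower = " + lowerThreshold);
    }
}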

Thresholds for power values

Here you set the thresholds for the maximum and minimum values the power can have; the power values will be clipped at those thresholds. You can set the power thresholds for each of the three agent parameters: strength, scale and color. The power values will then be scaled to values that are suitable for those three parameters.

Thresholds for agent parameters

Here you define the ranges to which the power values are scaled. Note that at the moment the color value should be between 0 and 1.

Power Mapping to Agent Parameters

Here you can define which power you want to use for which parameter; you can choose alpha for all three parameters, for example. If the „Reversed“ option is not checked, the power values are scaled directly, e.g. a low alpha power is scaled to a low strength value. If it is checked, a low power value is mapped to a high strength value.
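
To make the clipping, scaling and the „Reversed“ option more concrete, here is a small sketch of how such a mapping could look in Java. It is not taken from our repository; the method and parameter names are our own:

public final class PowerMappingSketch {

    // clips the power to [minPower, maxPower], scales it linearly to [minParam, maxParam]
    // and optionally reverses the direction so that a low power yields a high parameter value
    public static float mapPowerToParameter(float power,
                                            float minPower, float maxPower,
                                            float minParam, float maxParam,
                                            boolean reversed) {
        // clip the power at the thresholds
        float clipped = Math.max(minPower, Math.min(maxPower, power));
        // normalize to 0..1
        float normalized = (clipped - minPower) / (maxPower - minPower);
        if (reversed) {
            normalized = 1f - normalized;
        }
        // scale to the parameter range
        return minParam + normalized * (maxParam - minParam);
    }

    public static void main(String[] args) {
        // e.g. alpha power 4.2 in power range [2, 6] mapped to a noiseScale range [50, 300]
        System.out.println(mapPowerToParameter(4.2f, 2f, 6f, 50f, 300f, false));
        System.out.println(mapPowerToParameter(4.2f, 2f, 6f, 50f, 300f, true));
    }
}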

What would be good mappings for drawing?

A good mapping, for example, is to map the alpha power reversed onto the noiseScale, because higher alpha power means you are more relaxed, so the picture will look smoother/straighter as the lines mainly move in one direction instead of many different directions.

Mapping the beta power to the noiseStrength is also a useful mapping, because higher beta power means more concentrated or more intense thinking, so the picture will look more complex while you are, for example, solving some difficult math tasks.

Draw Methods & General Parameters

Here you can select which drawing method you want to use; both are explained at the end.

You can select how many agents you want to use for drawing (the value selected here is multiplied by 120 in the program).

You can also select the line size of the agents (stroke width).

The background color on which you want to draw on can also be adjusted.

The last option is only for method 1, here you can choose different color mappings.

Method 1

In Method 1 we map either the alpha power or the beta power of the chosen electrodes to everything. That means that according to changes in the alpha or beta power, noiseScale, noiseStrength and the colour will change. For the colour you can choose between different colour mappings, which means that when the power increases or decreases the colour gets more saturated/darker/brighter etc., but stays, for example, blue or green.
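
As a rough illustration of such a colour mapping (again, not the code of our app), the following Processing/Java snippet keeps the hue fixed and only varies saturation and brightness with a normalized power value:

import processing.core.PApplet;

public class ColorMappingSketch {

    // maps a normalized power value (0..1) to a colour with a fixed hue, e.g. blue;
    // higher power gives a more saturated and brighter colour, but the hue stays the same
    public static int powerToColor(PApplet applet, float normalizedPower, float hue) {
        applet.colorMode(PApplet.HSB, 360, 100, 100);
        float saturation = 40 + 60 * normalizedPower; // 40..100
        float brightness = 30 + 70 * normalizedPower; // 30..100
        int rgb = applet.color(hue, saturation, brightness);
        applet.colorMode(PApplet.RGB, 255); // restore the default colour mode
        return rgb;
    }
}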

Method 2

In Method 2 we map 12 of our electrodes to the agents: every electrode gets mapped to a twelfth of the agents, and all those agents get their own colour. For example: we initialize 120 agents; electrode 1 is mapped to the first 10 agents, which will be blue; electrode 2 is mapped to agents 11-20, which will be green; and so on. The agents then move only according to the alpha or beta power of their respective electrode.
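
A sketch of this grouping idea (our actual implementation may differ in details; the electrode order and group size below are only an example) could look like this:

public class ElectrodeGroupingSketch {

    static final String[] ELECTRODES = {
        "af3", "af4", "f3", "f4", "f7", "f8", "fc5", "fc6", "p7", "p8", "o1", "o2"
    };

    // index of the electrode that drives the agent with the given index
    static int electrodeForAgent(int agentIndex, int totalAgents) {
        int groupSize = totalAgents / ELECTRODES.length; // e.g. 120 agents -> 10 per electrode
        return Math.min(agentIndex / groupSize, ELECTRODES.length - 1);
    }

    public static void main(String[] args) {
        int totalAgents = 120;
        for (int i = 0; i < totalAgents; i += 10) {
            System.out.println("agent " + i + " -> " + ELECTRODES[electrodeForAgent(i, totalAgents)]);
        }
    }
}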

Braint Git Repositories

OSC Message Addresses / Ports

OpenVibe Addresses

OpenVibe sends the alpha and beta power of each electrode to the local machine (127.0.0.1) on port 12000. There is one value per electrode that you need to receive; see the sketch after the address lists below.

Alpha:

  • /openvibe/alpha/o1 (float)
  • /openvibe/alpha/o2 (float)
  • /openvibe/alpha/af3 (float)
  • /openvibe/alpha/af4 (float)
  • /openvibe/alpha/f3 (float)
  • /openvibe/alpha/f4 (float)
  • /openvibe/alpha/f7 (float)
  • /openvibe/alpha/f8 (float)
  • /openvibe/alpha/fc5 (float)
  • /openvibe/alpha/fc6 (float)
  • /openvibe/alpha/p7 (float)
  • /openvibe/alpha/p8 (float)

Beta:

  • /openvibe/beta/o1 (float)
  • /openvibe/beta/o2 (float)
  • /openvibe/beta/af3 (float)
  • /openvibe/beta/af4 (float)
  • /openvibe/beta/f3 (float)
  • /openvibe/beta/f4 (float)
  • /openvibe/beta/f7 (float)
  • /openvibe/beta/f8 (float)
  • /openvibe/beta/fc5 (float)
  • /openvibe/beta/fc6 (float)
  • /openvibe/beta/p7 (float)
  • /openvibe/beta/p8 (float)
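
As mentioned above, one float value arrives per electrode. A minimal oscP5 sketch (hypothetical class name, not our repository code) that extracts the band and electrode from the address pattern could look like this:

import oscP5.OscMessage;
import oscP5.OscP5;
import processing.core.PApplet;

public class OpenVibeOscSketch extends PApplet {

    OscP5 osc;

    public static void main(String[] args) {
        PApplet.main("OpenVibeOscSketch");
    }

    public void setup() {
        osc = new OscP5(this, 12000); // same port OpenVibe sends to
    }

    // oscP5 calls this method for every incoming message
    public void oscEvent(OscMessage msg) {
        String addr = msg.addrPattern();           // e.g. "/openvibe/alpha/o1"
        String[] parts = addr.split("/");          // ["", "openvibe", "alpha", "o1"]
        if (parts.length == 4 && parts[1].equals("openvibe")) {
            String band = parts[2];                // "alpha" or "beta"
            String electrode = parts[3];           // e.g. "o1"
            float power = msg.get(0).floatValue(); // the single float value
            println(band + " power of " + electrode + ": " + power);
        }
    }
}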

Braint C# Addresses

The Braint C# application sends affective, expressive and raw data to the local machine (127.0.0.1) on port 12000.

We will briefly explain what the OSC messages look like.

Affective OSC:

Address: /emoengine/affective

Message in correct order:

  • Header: „emotivTimeStamp, boredom, excitement, frustration, mediation, valence, excitementLongTerm“ (string)
  • Emotiv Time Stamp (float)
  • Boredom (double)
  • Excitement (double)
  • Frustration (double)
  • Mediation (double)
  • Valence (double)
  • Excitement Longterm (double)

Expressive OSC:

Address: /emoengine/expressiv

Message in correct order:

  • Header: „EmoState_Timestamp, LowerFaceAction,LowerFaceActionPower,UpperFaceAction,UpperFaceActionPower , ExpressivEyelidStateX,ExpressivEyelidStateY,ExpressivEyeLocationX,ExpressivEyeLocationY,IsBlink,AreEyesOpen,IsLeftWink,IsRightWink,IsLookingLeft,IsLookingRight,IsLookingDown,IsLookingUp“ (string)
  • Emotiv Time Stamp (float)
  • Lower Face Action (string)
  • Lower Face Action Power (float)
  • Upper Face Action(string)
  • Upper Face Action Power (float)
  • Eyelid State X (float)
  • Eyelid State Y (float)
  • Eye location X Position (float)
  • Eye location Y Position (float)
  • Blink Occurred (boolean)
  • Are Eyes Open (boolean)
  • Left Wink Occurred (boolean)
  • Right Wink Occurred (boolean)
  • Looking Left (boolean)
  • Looking Right (boolean)
  • Looking Up (boolean)
  • Looking Down (boolean)

Raw EEG Data:

Address: /emoengine/rawEEG

Message in correct order:

  • Counter (double)
  • Interpolated (double)
  • Raw_CQ (double)
  • AF3 (double)
  • F7 (double)
  • F3 (double)
  • FC5 (double)
  • T7 (double)
  • P7 (double)
  • O1 (double)
  • O2 (double)
  • P8 (double)
  • T8 (double)
  • FC6 (double)
  • F4 (double)
  • F8 (double)
  • AF4 (double)
  • Gyroscope X (double)
  • Gyroscope Y (double)
  • Timestamp (double)
  • EmoState Timestamp (double)
  • Func ID (double)
  • Func Value (double)
  • Marker (double)
  • Sync Signal (double)