Move up to modern de-artifacting

There is an ongoing dispute regarding de-artifacting methods used in qEEG. While some vested interests counsel against the use of modern techniques that remove artifact while leaving the underlying EEG intact, specialists in the area can provide a detailed reply to those interests. Just such a reply was recently posted to a commercial listserv, and we obtained the author's permission to re-post the discussion on the website in a non-commercial, publicly accessible form for all to see.

It specifically points out that the phase changes seen are due to the removal of artifact, not distortion of the underlying EEG; by contrast, EEG processed with classical approaches retains subtle residual artifacts.

If you cut time segments out of the EEG to remove artifacts, you also remove the underlying connectivity information; splicing discontinuous microstates together destroys the underlying time series.

In the give and take of the real world of neuroscience, the need for a valid time series that shows the connectivity of neural networks yet is free of artifact is driving the switch to techniques more modern than snipping out segments of time. If you want to distort the timeline (phase) of the EEG, just cut and paste lots of EEG together in one-second chunks.
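The distortion that splicing introduces is easy to demonstrate. The sketch below (a synthetic sinusoid with assumed values, not real EEG) cuts one second out of a continuous rhythm and butt-joins the remainder, producing both a waveform discontinuity and a phase shift in everything after the cut:

```python
import numpy as np

# Toy illustration (assumed values; not real EEG): a continuous 7.3 Hz
# rhythm sampled at 256 Hz.
fs = 256
t = np.arange(0, 4, 1 / fs)
x = np.cos(2 * np.pi * 7.3 * t)

# Classical artifact rejection: cut out the second 1-second epoch and
# butt-splice the remainder.  Removing 7.3 cycles shifts everything after
# the cut by 0.3 of a cycle (108 degrees) relative to the uncut record.
spliced = np.concatenate([x[:fs], x[2 * fs:]])

# Largest sample-to-sample step: the continuous rhythm changes smoothly,
# but the spliced series has a jump discontinuity at the cut point.
step_orig = np.max(np.abs(np.diff(x)))
step_spliced = np.max(np.abs(np.diff(spliced)))
print(step_orig, step_spliced)
```

Any connectivity measure computed across the splice point sees that artificial jump and phase offset rather than the brain's actual timing.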

The neuroscience community will undoubtedly continue to discuss these issues, but the need for clean, valid EEG is driving the field to these newer techniques, and they are performing well under scrutiny.

Jay Gunkelman

————— from Dr. Zeman, B.Eng (B.Sc.) 2000, Ph.D. 2009 (Interdisciplinary: Engineering, Biology, Psychology)

There seems to be quite a bit of controversy around our new artifact removal process, and I would be happy to answer any questions you might have. We have been developing data cleaning methods as the first step in technology we have partnered to develop: EEG assessment of schizophrenia and bipolar disorder, and early detection of Alzheimer's disease with differentiation from MCI. Since we need to assess these types of conditions reliably, we need the best possible estimates of brain activity, and since the existing methods do not meet our needs, we have used our skills and experience to create something that does. I am sure many of you would agree that the project of improving the detection of brain function via EEG is both worthy and challenging. It is not surprising that controversy arises.

I invite you to support our effort to get robust and reliable data cleaning methods out into the world so that we can all get more from our data. You can do this by asking us questions directly and by giving us feedback as necessary. My personal email address is

I will address a number of points about data cleaning in this email, but first I should address a point of confusion in the previous public email, in which Bob Thatcher stated, 'It is correct that Phil Zeman used reconstruction methods that are not ICA, nonetheless, tests of the method showed that it distorted phase and differences between channels. I submitted test files to Phil and he returned so-called "artifact corrected" files and all of the tests showed distorted phase differences.' To our knowledge, Bob has not tested the algorithms we currently have on our server. That said, I think I know the source of the confusion.

Bob is absolutely correct that nearly two years ago I proposed a method for removing artifacts based on Daubechies wavelets. By design, that particular algorithm 'touches' the data only in locations where artifact is found and changes the phase at those locations to more accurately represent the underlying brain activity. It touched the data in places where Bob did not expect an algorithm to find artifact. It was a good algorithm; however, it was not as robust as we needed to meet our goal and left too much residual artifact in the data. Hence, we have developed a new, unique algorithm that meets our needs. With regard to changing the phase of the data, the original Daubechies wavelet algorithm did indeed change the phase relationships between channels at locations where artifact was subtracted, the goal of course being to reveal the underlying brain activity. (We have posted our original response to Bob's comments from nearly two years ago. See the file "zeman-rebuttaltobobthatcher.pdf", available from our FAQ page:
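To make the general idea concrete, here is a minimal sketch of wavelet-based artifact suppression. This is NOT the author's algorithm, just a generic illustration of the Daubechies-wavelet approach using the PyWavelets library, with all signal parameters and the clipping threshold assumed for the example: decompose, clip only the implausibly large coefficients produced by the artifact, and reconstruct, so the data are touched only where artifact energy appears.

```python
import numpy as np
import pywt  # PyWavelets

# Synthetic 2-second recording at 256 Hz (assumed values; not real EEG):
# a 10 Hz rhythm plus a brief high-amplitude transient artifact.
fs = 256
t = np.arange(0, 2, 1 / fs)
eeg = np.sin(2 * np.pi * 10 * t)     # underlying brain rhythm
artifact = np.zeros_like(eeg)
artifact[200:240] = 15.0             # e.g. an electrode pop
recorded = eeg + artifact

# Daubechies-4 decomposition; clip coefficients whose magnitude is
# implausibly large for brain activity, then reconstruct.
coeffs = pywt.wavedec(recorded, 'db4', level=5)
clip = 6.0   # assumed threshold; real use needs a data-driven estimate
cleaned_coeffs = [np.clip(c, -clip, clip) for c in coeffs]
cleaned = pywt.waverec(cleaned_coeffs, 'db4')[:len(recorded)]
subtracted = recorded - cleaned      # what was removed, kept for auditing

mse_before = np.mean((recorded - eeg) ** 2)
mse_after = np.mean((cleaned - eeg) ** 2)
print(mse_before, mse_after)  # cleaning should move the data closer to the rhythm
```

Note that the cleaned data plus the subtracted artifact reproduce the original record exactly, the same accounting property discussed later in this email.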

Since artifact itself changes the phase of scalp recordings, subtracting the artifact changes the phase back to better reflect the brain activity contained in the EEG.
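A minimal two-channel model (all values assumed for illustration) shows the effect: an artifact common to both channels pulls the measured phase difference toward zero, and subtracting it restores the true relationship.

```python
import numpy as np

# Toy two-channel model (assumed values; not real EEG).
fs, dur = 256, 2.0
t = np.arange(0, dur, 1 / fs)
f = 10                                          # 10 Hz; 20 full cycles fit the window
brain1 = np.sin(2 * np.pi * f * t)              # channel 1 brain rhythm
brain2 = np.sin(2 * np.pi * f * t - np.pi / 2)  # channel 2: true 90-degree lag
artifact = 3 * np.sin(2 * np.pi * f * t + 1.0)  # one artifact common to both channels

def phase_diff(x, y, f_hz):
    """Phase difference between x and y at the f_hz Fourier component."""
    k = int(round(f_hz * dur))                  # exact FFT bin for f_hz
    X, Y = np.fft.rfft(x)[k], np.fft.rfft(y)[k]
    return np.angle(X * np.conj(Y))

true_diff = phase_diff(brain1, brain2, f)                           # pi/2
dirty_diff = phase_diff(brain1 + artifact, brain2 + artifact, f)    # pulled toward 0
clean_diff = phase_diff(brain1 + artifact - artifact,
                        brain2 + artifact - artifact, f)            # pi/2 again
print(true_diff, dirty_diff, clean_diff)
```

Here the phase change caused by subtraction is exactly the change needed to recover the true inter-channel relationship, which is the point being made above.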

In summary: (1) to our knowledge, Bob has not yet tested the data cleaning algorithms that we currently offer for public use, and (2) a change in phase does not necessarily mean the brain activity contained in the EEG data is distorted. I invite Bob to test our latest algorithms and to offer criticism and comments directly to our actual email addresses so that we are sure to receive them. We appreciate any feedback so that we can make improvements as necessary. I think we all share the same goal, to get the most out of our data, and I am sure we would all like to do this with as much accuracy and confidence as possible. I am confident that what we currently offer is the best available: simple statistical thresholding does not target specific artifacts; ADJUST only identifies which ICA components contain artifact; convolution methods do not adapt to multiple datasets and non-stationary artifacts; and the naive use of ICA on low-dimensional data risks removing brain activity when a complete component is subtracted.


To further elaborate on the statement that a change in phase (measured by coherence) does not necessarily mean brain activity is distorted, I have written a short tutorial with figures. I have also included the Matlab/Octave code if you would like to play with the model in the tutorial. That tutorial is available on our website. (See ‘Phase Coherence Changes in the Presence of Artifacts’ available from our FAQ page:

For some time Bob has made important points about phase change, and it is our understanding that his concern is not being able to obtain an estimate of the true phase characteristics of brain activity. We have taken his points seriously. Because it is important to us to know how well our algorithms are performing, and to revise them whenever a problem is found, we have been collecting statistics to demonstrate any trends in changed phase that are related to the cleaning process. These statistics are published on our website and will be continually updated.


We invite you to check us out for yourself. If you are interested in having your data cleaned, we can accept data either via secure SFTP transfer or via Dropbox. The results we send back include both the 'cleaned data' and the 'subtracted artifact'. It is easy to see what has been subtracted by comparing the two files. Hence, you can account for every change to the data, and you can recover the original data by adding the two files together.
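That accounting check takes one line of code. In this sketch, arrays stand in for the two returned files (the toy "artifact" and "cleaned" arrays are assumptions for illustration, not the service's actual output format):

```python
import numpy as np

# Array stand-ins for the raw recording and the two returned files
# (values assumed for illustration only).
rng = np.random.default_rng(1)
original = rng.standard_normal(1024)                       # the raw recording you sent
artifact = np.where(np.abs(original) > 2, original, 0.0)   # toy "subtracted artifact" file
cleaned = original - artifact                              # toy "cleaned data" file

# The accounting check described above: adding the two returned files
# must reproduce the original recording sample for sample.
assert np.allclose(cleaned + artifact, original)
```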

We provide users of the service with both the cleaned result and what was subtracted so that they can verify that we haven't altered the data. If warranted, we can instead send the user a file that describes what to subtract from the data (i.e., the artifact), so that the actual subtraction of artifact is performed on a dataset that has never left the user's custody. Please send your notes on this topic to my email address in the second paragraph of this message to make sure that I get them, and I would be happy to come up with a workable solution.


In our applications, we use data cleaning to reduce artifact-related variance between study participants and reveal the aspect of brain function we are investigating. Removing the artifact allows us to see what is really going on.

I believe Bob Thatcher also offers a data cleaning algorithm in his software. A colleague of mine sent the following link to me earlier today. (page 18; keywords: manual selection, automatic artifact procedure, artifact free template.)


(1) If you spend $100 worth of time cleaning a 5-minute data set manually, is it worth spending $10 to have a clean set of data?

(2) Data containing artifact are not OK. Yes, certain QEEG analysis algorithms are robust enough to tolerate a certain amount of artifact, but do we want to base our decisions on data that contain artifact?

(3) Research into brain disorders and mental illness can be expensive. When artifact masks statistically significant experimental effects, the researcher loses knowledge and opportunity, as does society as a whole. We offer one way to clean data while making sure that measures of brain function are preserved.

(4) Manual inspection of EEG and selection of segments that may be free of artifact has costs in both time and money. EEG data from some groups of people contain significant artifact throughout the entire recording. A solution that removes artifact in such cases is very important.

Before I close this long technical email, I want to briefly talk about inspiration and moving our field forward. I attended the National Alliance on Mental Illness (NAMI) convention in Seattle this year and was very much inspired by their grass-roots initiative to create useful solutions for the mental illness needs of today and the growing issues around neurodegenerative diseases. At the conference I met many, many supporters of creating technology that helps the countless people in North America and around the world with mental illness: technology that helps diagnose, and helps us understand the causes of, these neurodegenerative diseases and mental illnesses. Please consider joining our group to develop new assessment technologies and learn about our unique approach to this problem. (See our neuroscience research accelerator LinkedIn page:

Hopefully this email has answered some of your questions and alleviated some of your concerns. If any of you have any further questions, feel free to drop me a note at my email address and I’ll respond as quickly as possible.

(1) Improving health by getting more information from our data.

(2) Making innovations by identifying incorrect assumptions about data and creating robust engineering solutions using the latest techniques math and science have to offer.

Dr. Zeman, B.Eng (B.Sc.) 2000, Ph.D. 2009 (Interdisciplinary: Engineering, Biology, Psychology)

CEO, CTO, Applied Brain and Vision Sciences Inc.
