
Finding compatible eye-tracking #163

Open · FBanani opened this issue Apr 25, 2024 · 10 comments

FBanani commented Apr 25, 2024

Hi @ajdroid,

I have another question. How can I determine if the eye-tracking equipment is compatible with this driving simulator? For example, I found this eye-tracking equipment: https://pupil-labs.com/products/neon.
What criteria should I search for?


shh1v commented May 5, 2024

Hi @FBanani, we are using the Pupil Core eye tracker, and it's amazing! We had quite a lot of challenges integrating it, but in the end, the results turned out well. I haven't extensively researched Neon; however, I know it does not have a Network API that allows you to communicate as comprehensively as the Core tracker does. I would suggest you go forward with the research-oriented eye tracker, Pupil Core.


shh1v commented May 5, 2024

Our source code for integrating the Pupil Core eye tracker is available here.


FBanani commented May 5, 2024

> Pupil Core eye tracker

Thank you so much. My advisor bought the Neon last week; I hope I can integrate it.


FBanani commented May 5, 2024

> Our source code for integrating the Pupil Core eye tracker is available here.

Do you think there is much difference between Pupil Core and Neon, since both of them are from the same lab?
Also, do you have any documentation for installing this eye tracker? I want to get started.

Regards


shh1v commented May 5, 2024

The Pupil Core eye tracker has a comprehensive Network API that uses the ZMQ PUB-SUB communication protocol. This ZMQ communication channel can be implemented in many programming languages, including C++. This is convenient since you can implement a communication framework in Unreal Engine, although it is not trivial. On the other hand, Pupil Neon has a more limited API, and only in Python.
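
For context, here is a minimal sketch (in Python for brevity; a C++ implementation for Unreal Engine would follow the same request-then-subscribe pattern) of how the Core Network API is typically used. The default Pupil Remote address tcp://127.0.0.1:50020 and the gaze. topic prefix are assumptions based on a stock Pupil Capture setup, so adjust them to your configuration.

```python
# Minimal sketch: subscribe to Pupil Core gaze data via the ZMQ Network API.
# Assumes Pupil Capture is running with Pupil Remote on its default port (50020).
import zmq
import msgpack

ctx = zmq.Context()

# 1) Ask Pupil Remote (REQ/REP) for the SUB port of the PUB-SUB backbone.
pupil_remote = ctx.socket(zmq.REQ)
pupil_remote.connect("tcp://127.0.0.1:50020")
pupil_remote.send_string("SUB_PORT")
sub_port = pupil_remote.recv_string()

# 2) Subscribe to all gaze topics on that backbone.
subscriber = ctx.socket(zmq.SUB)
subscriber.connect(f"tcp://127.0.0.1:{sub_port}")
subscriber.subscribe("gaze.")

# 3) Messages arrive as multipart frames: [topic, msgpack-encoded payload].
for _ in range(10):
    topic, payload = subscriber.recv_multipart()
    gaze = msgpack.loads(payload, raw=False)
    print(topic.decode(), gaze["norm_pos"], gaze["confidence"])
```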

Ultimately, it depends on your implementation goals. If all you require is to record eye-gaze data during trials, Neon can work just as well. However, if you plan to enable DReyeVR to read and use eye-tracking data (such as what we are doing, e.g., manipulating the head-up display based on where the driver is looking), then I suggest you return the Neon and get the Core.

Edit: I never went forward with Neon just because it isn't open source, so I have limited knowledge about it. You should consult with the Pupil Labs team to explore the capabilities of Neon.


FBanani commented May 5, 2024


Thank you so much for letting me know. I will contact Pupil Labs.

@mikelgg93

Hi @shh1v, @FBanani

Miguel here from the Product Specialist team at Pupil Labs. I hope you don't mind if I chime in just a little bit. 😀

Pupil Core does a good job in XR, and over the years, lots of people have used it successfully. Nowadays, we would recommend Neon for XR applications, though, because it is usually easier to use and often provides higher-quality data. This is for a couple of reasons:

  • Neon’s gaze estimation is much more robust to slippage, which is often prevalent in XR. Neon’s pupillometry estimation is also generally more robust.
  • Neon is calibration-free after setup and does not require setting ROIs for eye cameras like Pupil Core.
  • The Neon module can be placed in different VR headsets using different mounts (and also in regular spectacle frames, making research more comparable between XR and the real world), while the Pupil Core add-ons only work with one specific device, so Neon offers more flexibility.

Neon’s gaze estimation pipeline is indeed not open-source, but most other components are: the fixation and blink detectors, Neon Player, and the Unity integration, for example. So, we hope we have not lost all the points for openness! 😉

The Python client is certainly the easiest way to use Neon’s real-time API. But the underlying RTSP, HTTP, and WebSocket protocols can be interfaced with from any language in principle (See under the hood). In fact, the Unity integration uses a C# client, for example!
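
To make the Python route concrete, here is a rough sketch using the simple client from the pupil-labs-realtime-api package; the function and field names reflect the public documentation at the time of writing, so treat them as assumptions and double-check against the current docs.

```python
# Rough sketch: read gaze from Neon over the real-time API using the
# pupil-labs-realtime-api package (pip install pupil-labs-realtime-api).
from pupil_labs.realtime_api.simple import discover_one_device

# Discover a Neon Companion device on the local network (blocks until found).
device = discover_one_device()

try:
    for _ in range(10):
        gaze = device.receive_gaze_datum()  # gaze in scene-camera pixel coordinates
        print(gaze.x, gaze.y, gaze.timestamp_unix_seconds)
finally:
    device.close()
```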

I’d be super curious to hear about any shortcomings you found or where you feel the real-time API falls short, particularly with respect to DReyeVR's ability to read and use eye-tracking data.


FBanani commented May 16, 2024

Hi @mikelgg93,

I wanted to clarify a couple of points. Firstly, DReyeVR actually operates with Unreal Engine, not Unity. Thanks for noting that Neon works with Unity though.

Secondly, considering the additional cognitive workload VR headsets impose, I've decided to opt for eye tracking equipment to collect data when participants use monitors instead of VR headsets. What @shh1v has done is provide a method to gather all data from the simulation in each frame, including eye gaze and simulation metrics such as speed and accelerator usage. This will be immensely helpful for further analysis.
Moreover, since he provided the integration code, it will be incredibly time-saving for me (considering the time needed to develop integration code for Neon, similar to what he did for Pupil Core) and budget-friendly (as I'm not a computer expert, my advisor would likely need to allocate funds to hire someone for assistance). As a result, I lean towards using Pupil Core unless you confirm that the code he provided also works seamlessly for Neon.

Looking forward to your insights on this matter.
Thank you for your attention to these details!

@mikelgg93

Hi @FBanani

Thank you for your detailed message. We can’t guarantee that Neon or Pupil Core will work seamlessly with any non-official third-party libraries. From the project description and the comments from the OP, it sounds like Pupil Core is indeed already integrated, and changing to Neon's real-time API would require rewriting some parts of the code.

If you already have a Neon and want to give it a go and see how it works and what the experience with Pupil Core would be like, note that you can use Neon as if it were a Pupil Core.

Ultimately, the choice of which eye-tracking system to use is yours; we can only provide recommendations based on what we believe would be best for your use case.

Regarding VR vs monitors, just so you know, if you would like to use monitors instead of simulating in VR, you would need to remap the gaze from the world camera to the screen using, for example, the Marker Mapper / Surface Tracker and AprilTags.
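
In case it helps to see what that remapping looks like on the Pupil Core side, below is a minimal sketch of reading surface-mapped gaze in real time from Pupil Capture's Surface Tracker plugin over ZMQ. The surface name "screen" is a hypothetical example (you define the surface yourself with AprilTag markers in Pupil Capture), and the default Pupil Remote port 50020 is assumed.

```python
# Minimal sketch: read screen-mapped gaze from Pupil Capture's Surface Tracker.
# Assumes a surface named "screen" has been defined with AprilTag markers.
import zmq
import msgpack

ctx = zmq.Context()
pupil_remote = ctx.socket(zmq.REQ)
pupil_remote.connect("tcp://127.0.0.1:50020")
pupil_remote.send_string("SUB_PORT")
sub_port = pupil_remote.recv_string()

subscriber = ctx.socket(zmq.SUB)
subscriber.connect(f"tcp://127.0.0.1:{sub_port}")
subscriber.subscribe("surfaces.screen")  # hypothetical surface name

while True:
    topic, payload = subscriber.recv_multipart()
    surface = msgpack.loads(payload, raw=False)
    for gaze in surface["gaze_on_surfaces"]:
        if gaze["on_surf"]:
            # Normalized surface coordinates; multiply by the monitor's
            # resolution to get on-screen pixel coordinates.
            print(gaze["norm_pos"])
```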

On a side note, I wanted to let you know that we can assist you with the coding integration through our custom consultancy packages. Email us, and we can look at the scope.

Cheers,
Miguel


FBanani commented May 20, 2024


Hi @mikelgg93

My advisor decided to work with Neon. As soon as I received it, I started trying to modify the code. I hope you can help me.
