For GTA work, fewer students came to office hours this week, probably because of Spring Break, which was a relief. However, I still had to respond to CS5950 students via email for help on the assignment even after the deadline, because not all students were able to submit on time. I also caught up on the GTA training courses and completed grading Project 1 for CS1054.
Coursework this week consisted of a few reading summaries and a proof-of-life presentation for the Advanced Topics project. I am still a few lectures behind in the Machine Learning course and am hoping to catch up over Spring Break.
My research work was not kind to me this week. The good part: I received the second HoloLens 2 and could start testing Surf Share. However, I have been running into a lot of technical issues ever since. I was able to deploy the build on both devices, but getting them to communicate was complicated, mainly because Surf Share has no configuration documentation, and initially I was just reading through their code to figure out how the networking had been implemented. Through this exercise I learned that the application uses two types of networking: WebRTC for video sharing, with node-dss as the signaling server, and Mirror Networking over the KCP protocol for object sharing. Since the application flow requires both users to configure the portal (object) before video streaming starts, the Mirror network must work before WebRTC can be tested.

The architecture of the application was also open to interpretation due to the lack of documentation, so I initially assumed the first user would be the host (server + client) and the second user a client. However, the IP configuration did not work out that way. I later found out that the PC is supposed to serve as the server (for the Mirror network only; since the PC does not share video, its node-dss signaler and webcam objects are disabled) and that both HoloLens 2 devices connect to it as clients. This configuration works fine when I run it in three Unity instances, and the node-dss terminal shows the expected logs, at least as far as the basic connection goes; but when I deploy the builds to the HoloLens 2 devices, the connection fails. I ended up emailing the creator of Surf Share with my questions and got a positive response: they are willing to help me with the IP configuration and explain how the networking component was designed. They have set up a meeting with me in the coming week, so I am hopeful that will help us move forward.
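To keep the topology straight in my head, I sketched the intended Mirror setup as a tiny hub-and-spoke simulation: one "PC" process acting as the server and two "HoloLens" processes connecting to it as clients. This is purely illustrative Python, not Surf Share's actual code (which is Unity/C# using Mirror over KCP); the names, port handling, and handshake message are all hypothetical.

```python
# Illustrative sketch of the hub topology: one server ("PC") accepts
# connections from two clients ("HoloLens" devices), each of which
# announces itself by name. All names/ports here are made up.
import socket
import threading

def run_server(ready, results, nclients=2):
    # The "PC" role: listen and accept both headset clients.
    srv = socket.socket(socket.AF_INET, socket.SOCK_STREAM)
    srv.bind(("127.0.0.1", 0))            # ephemeral port for the demo
    srv.listen(nclients)
    ready["port"] = srv.getsockname()[1]  # publish the chosen port
    ready["event"].set()
    for _ in range(nclients):
        conn, _ = srv.accept()
        results.append(conn.recv(1024).decode())  # client announces itself
        conn.close()
    srv.close()

def connect_client(name, port):
    # The "HoloLens" role: connect to the PC and send a short greeting.
    with socket.create_connection(("127.0.0.1", port)) as c:
        c.sendall(name.encode())

def demo():
    results = []
    ready = {"event": threading.Event()}
    t = threading.Thread(target=run_server, args=(ready, results))
    t.start()
    ready["event"].wait()                 # wait until the server is listening
    for name in ("hololens-1", "hololens-2"):
        connect_client(name, ready["port"])
    t.join()
    return sorted(results)
```

The point of the sketch is simply that both headsets initiate connections toward a single fixed server address, which matches the configuration that works in the Unity-instance test.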
Testing the technology is essential for my study to proceed, as the task depends on what the tool is capable of. In the meantime, I started jotting down the structure for my introduction/motivation. I had drafted one two weeks ago, but it was structured mainly around telemedicine; after discussion with Dr. Ji and Dr. David-John, I revamped it into a more general remote-guidance-through-collaborative-AR structure. I plan to have the new structure reviewed again before I start phrasing it out for a paper.
Lastly, I read another interesting paper in the same domain:
Sharing gaze rays for visual target identification tasks in collaborative augmented reality
Authors: Austin Erickson, Nahal Norouzi, Kangsoo Kim, Ryan Schubert, Jonathan Jules, Joseph J. LaViola Jr., Gerd Bruder, Gregory F. Welch
Not far from what we are trying to achieve with our project, this paper focuses on the effects of different types of error on users' collaborative performance in AR. However, they specifically focus on a shared-gaze environment, where the introduced errors directly influence the shared gaze rays. They also determine the error thresholds beyond which user performance deteriorates and compare users' subjective assessments of these thresholds with the objective results.