The Evolution of Software Engineering Roles Due to Emerging Technologies

August 29th, 2023

Table of Contents

Emerging Technologies and Software Engineering Roles
Introduction to Extended Reality
The Impact of XR on Software Engineering Roles
The Evolution of Frontend Web Developer Responsibilities
Ethical Considerations and Challenges in XR
Extended Reality in the Software Development Life Cycle
Requirements Gathering
Requirements Analysis
Design
Coding
Testing
Deployment
Maintenance
Benefits of XR
Drawbacks of XR
References

Emerging Technologies and Software Engineering Roles

Introduction to Extended Reality


Extended Reality (XR) is an umbrella term for technologies that aim to combine the physical and virtual worlds into one seamless experience. Popular technologies that fall under this umbrella include Augmented Reality (AR), Virtual Reality (VR), and Mixed Reality (MR). XR is without a doubt an emerging technology: its current market size of $105.58 billion USD is expected to grow to $472.3 billion USD by 2028 (Mordor Intelligence, 2023). In addition, the Apple Vision Pro, a spatial computer that merges AR and VR, was announced in early June this year.

The Impact of XR on Software Engineering Roles


The rapid growth of advanced technologies such as extended reality is bound to affect software engineering roles in various ways. One of the major impacts is enhanced remote collaboration and better communication among programmers (Elliott et al., 2015). Remote work is already commonplace in the software engineering field, and XR takes it a step further by letting developers in different geographical locations join each other in a virtual live coding environment. This also promotes parallel development, as multiple teams (development, code review, design) can work simultaneously.

That scenario involved software engineers using XR to develop applications, but what if software engineers need to develop applications for XR? One consequence is that developers and designers would need to learn new skills and tools, such as 3D modelling, advanced animation, and spatial sound design, to deliver functional software efficiently. Alongside this comes a host of challenges, such as creating immersive experiences and ensuring user safety.

The Evolution of Frontend Web Developer Responsibilities


The responsibility of a frontend web developer, at its simplest, is to build fast, responsive, user-friendly web pages using HTML, CSS, and JavaScript. In recent years, this role has seen various changes with the increased popularity of JavaScript frameworks (like React) and component-based, modular development. Extended reality evolves the role's responsibilities in two main ways. Firstly, to make websites that incorporate XR, frontend web developers need to learn and master new XR frameworks. These include WebXR, A-Frame, and ReactVR (Alexey, 2021), which provide developers with the necessary tools, libraries, and APIs to create XR content that can be accessed via web browsers.
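To give a flavour of what working with these browser APIs involves, below is a minimal sketch using the WebXR Device API to detect support and start an immersive VR session. The enter-vr button id is an assumption made for this example, not part of any framework.

```javascript
// Minimal WebXR sketch: feature-detect support, then start an immersive session.
// The "enter-vr" button id is hypothetical and exists only for this example.
async function enterVR() {
  if (!navigator.xr) {
    console.warn('WebXR is not available in this browser.');
    return;
  }
  const supported = await navigator.xr.isSessionSupported('immersive-vr');
  if (!supported) {
    console.warn('Immersive VR sessions are not supported on this device.');
    return;
  }
  // requestSession must be called from a user gesture, hence the click handler.
  const session = await navigator.xr.requestSession('immersive-vr');
  session.addEventListener('end', () => console.log('XR session ended.'));
  // Rendering would then be driven by session.requestAnimationFrame(...).
}

document.getElementById('enter-vr').addEventListener('click', enterVR);
```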

Secondly, frontend web developers need to learn how to create and implement immersive UI/UX designs. This involves considering components such as spatial mapping and 3D visualization, which are used to anchor virtual objects or information to the user's physical space (Uxcel, 2023). Even basic interactions like text fields need to be redefined, because instead of a keyboard and mouse, the input methods are gestures, voice commands, and hand controllers.
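As a rough sketch of how such input might be handled, the WebXR Device API fires a select event that plays roughly the role a click event plays on a flat web page. The activateElementUnderRay helper below is hypothetical and stands in for application-specific hit testing.

```javascript
// Hypothetical helper: a real app would ray-cast into the scene to find the
// virtual UI element the user is pointing at, then trigger its action.
function activateElementUnderRay(inputSource, frame) {
  console.log(`Select via ${inputSource.targetRayMode}`, frame);
}

// A WebXR "select" event is the XR analogue of a click: it fires for the
// primary action of a controller, hand gesture, or gaze-based pointer.
function onSessionStarted(session) {
  session.addEventListener('select', (event) => {
    const source = event.inputSource;
    // targetRayMode distinguishes gaze pointers, tracked controllers/hands,
    // and screen taps on handheld AR devices.
    if (source.targetRayMode === 'tracked-pointer' || source.targetRayMode === 'gaze') {
      activateElementUnderRay(source, event.frame);
    }
  });
}
```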

Ethical Considerations and Challenges in XR


Despite groups such as IEEE promoting an increased focus on ethics (for XR and technology in general), various ethical considerations remain (IEEE, 2022). The most obvious is separation from the real world. Putting on an XR headset means blocking yourself off from the physical world, which can become an issue when working among software engineering professionals: it limits peers' ability to ask questions and removes the physical learning environment. Furthermore, there are a handful of challenges associated with integrating XR into development teams, with health concerns at the top of the list. According to Stack Overflow's survey of 41,151 respondents, the majority of software engineers work 8-9 hours a day, of which 2-5 hours are spent coding (Elmar, 2023). Spending long hours in an XR headset can lead to nausea and dizziness, which occur when the brain receives mixed signals: the eyes register digital motion while the inner ear knows that the body is in the real world.

Extended Reality in the Software Development Life Cycle

Requirements Gathering


In this phase, meetings are held with stakeholders and information about the problem domain is gathered. With the emergence of XR technologies like immersive augmented reality (IAR), requirements engineers can better understand and respond to stakeholders (Merino et al., 2020). In practice, the requirements engineers talk to the stakeholders as usual, but while this is happening, the IAR headset provides an interactive view of "nodes" or "entities" in real time. This, in turn, helps the requirements engineers visualize the relationships between concepts.
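As a minimal sketch of the kind of data such a headset might render, consider the entity graph below. The RequirementsGraph class and its example entities are illustrative assumptions, not part of the cited tooling.

```javascript
// Hypothetical sketch: an in-memory graph of requirement "entities" that an
// IAR headset could render as linked nodes during a stakeholder meeting.
class RequirementsGraph {
  constructor() {
    this.nodes = new Map(); // entity name -> { name, notes }
    this.edges = [];        // { from, to, relation }
  }
  addEntity(name, notes = '') {
    this.nodes.set(name, { name, notes });
  }
  relate(from, to, relation) {
    this.edges.push({ from, to, relation });
  }
}

// Example: concepts captured live while a stakeholder describes the domain.
const graph = new RequirementsGraph();
graph.addEntity('Customer', 'Registers and places orders');
graph.addEntity('Order', 'Created at checkout');
graph.relate('Customer', 'Order', 'places');
console.log(graph.edges); // [{ from: 'Customer', to: 'Order', relation: 'places' }]
```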

Requirements Analysis


The main purpose of the requirements analysis phase is to make sense of the information collected in the previous phase and to plan the general development approach. Complementary to writing notes and drawing on whiteboards, requirements engineers can use IAR headsets to digitize their notes and diagrams (Merino et al., 2020). This allows data to be organized spatially and improves efficiency; large chunks of data can be decomposed, and related data can be bundled freely.

Design


In the design phase, the overall look, feel, and functionality of the product are decided. Using IAR in this phase allows designers to project UML diagrams (such as use case diagrams) onto a wall (Merino et al., 2020). Designers can then add use cases by drawing boxes and linking them with lines using markers, which function as styluses. The software converts these sketches into proper use cases, and built-in microphones with voice recognition can be used to give the use cases names.
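The tooling described by Merino et al. is not reproduced here, but purely as an illustration of the voice-naming step, a web-based prototype could lean on the browser's Web Speech API:

```javascript
// Illustrative sketch: naming a freshly drawn use case by voice with the
// Web Speech API. The useCase object shape is an assumption for this example.
const SpeechRecognition = window.SpeechRecognition || window.webkitSpeechRecognition;

function nameUseCaseByVoice(useCase) {
  if (!SpeechRecognition) {
    console.warn('Speech recognition is not available in this browser.');
    return;
  }
  const recognition = new SpeechRecognition();
  recognition.lang = 'en-US';
  recognition.onresult = (event) => {
    // Take the top transcript as the name, e.g. "Place order".
    useCase.name = event.results[0][0].transcript;
    console.log(`Use case named: ${useCase.name}`);
  };
  recognition.start();
}

nameUseCaseByVoice({ name: '' });
```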

Coding


In this phase, the design from the previous phase is converted into code and executable programs. Mixed reality can be used here in many ways. Firstly, developers can use IAR headsets to place recently viewed files to the left of their desktop and documentation for the technologies they need to the right, which can significantly boost productivity. Secondly, remote pair programming (also known as distributed or virtual pair programming) becomes possible (Dominic et al., 2020). The navigator and driver can both use XR devices to work at a virtual desk from anywhere, which promotes flexibility.
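A spatial workspace like the first idea can be described as panels pinned at positions around the developer. The pinPanel helper and the metre-based coordinates below are assumptions made for illustration.

```javascript
// Hypothetical helper: a real IAR shell would anchor the panel in headset space.
function pinPanel(name, position) {
  console.log(`Pinned ${name} at`, position);
}

// Positions are metres relative to the developer's seat: recently viewed
// files on the left, the editor ahead, documentation on the right.
const workspace = [
  { panel: 'recent-files', position: { x: -0.8, y: 0.0, z: -1.0 } },
  { panel: 'editor',       position: { x:  0.0, y: 0.0, z: -1.0 } },
  { panel: 'docs',         position: { x:  0.8, y: 0.0, z: -1.0 } },
];

workspace.forEach(({ panel, position }) => pinPanel(panel, position));
```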

Testing


In this phase, software is assessed at various levels (component testing, integration testing) to ensure it meets the project's standards. Similar to the design phase, testers can use IAR to write and test code virtually on a whiteboard with syntax highlighting. "City visualization" health checks can also be done using IAR (Merino et al., 2020). Here, buildings represent test classes and districts represent test suites, so a lack of buildings in a district means it lacks test coverage. The result is an immersive and interactive way to assess projects.
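To make the metaphor concrete, below is a small sketch of how test data might be mapped onto such a city. The input shape (suites containing test classes with test counts) is an assumption for illustration.

```javascript
// Hypothetical sketch of the "city" metaphor: districts are test suites,
// buildings are test classes, and an empty district flags a coverage gap.
function buildTestCity(testSuites) {
  return testSuites.map((suite) => ({
    district: suite.name,
    buildings: suite.testClasses.map((tc) => ({
      name: tc.name,
      height: tc.testCount, // taller buildings = more tests
    })),
    needsCoverage: suite.testClasses.length === 0,
  }));
}

const city = buildTestCity([
  { name: 'checkout', testClasses: [{ name: 'CartTest', testCount: 12 }] },
  { name: 'payments', testClasses: [] }, // empty district -> coverage gap
]);
console.log(city.filter((d) => d.needsCoverage).map((d) => d.district)); // ['payments']
```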

Deployment


In the deployment phase, the finished software is released and installed on user machines. Although XR is more prominent in the other phases, it can still be used in deployment in niche ways, such as end-user training. VR headsets can be used to replicate a virtual office and help prepare users for various expected, and unforeseen, scenarios (Bennett, 2023). Such VR training simulators can include troubleshooting tasks where, if the user fails to complete a task, a step-by-step guide is displayed in the spatial environment.
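A minimal sketch of such a training task follows, assuming a simple list of steps; showGuideInHeadset stands in for whatever spatial rendering a real simulator would perform.

```javascript
// Hypothetical helper: a real simulator would render the guide as 3D panels.
function showGuideInHeadset(steps) {
  steps.forEach((s, i) => console.log(`Step ${i + 1}: ${s.description}`));
}

// Run a troubleshooting task; on the first failed step, surface the guide.
function runTroubleshootingTask(task, attemptStep) {
  for (const step of task.steps) {
    if (!attemptStep(step)) {
      showGuideInHeadset(task.steps);
      return { completed: false, failedAt: step.description };
    }
  }
  return { completed: true };
}

// Mock run: the trainee fails the second step, triggering the guide.
const task = {
  steps: [{ description: 'Restart the service' }, { description: 'Check the error logs' }],
};
const result = runTroubleshootingTask(task, (step) => step.description !== 'Check the error logs');
console.log(result); // { completed: false, failedAt: 'Check the error logs' }
```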

Maintenance


In this phase, the software is kept fully functional and newly discovered issues are fixed. Once again, IAR headsets and the whiteboard approach can be used. When fixing newly found issues, developers can have the software's build configurations displayed spatially while they work on debugging, and collaborative software can use cameras to capture and update the new build configuration (Merino et al., 2020). Even in a software update scenario, developers can brainstorm ideas and solutions faster; the whiteboard approach fosters a creative and collaborative experience.

Benefits of XR


As mentioned earlier, enhanced remote collaboration is one of the most significant benefits of using XR in the SDLC. Platforms like Skype offer connectedness but lack speed, while instant messaging offers speed but lacks connectedness (Elliott, 2015). XR technologies sit between and above these, as they offer speed, connectedness, and the third dimension of reality. This allows teams in separate locations to work seamlessly in a shared virtual space. Another benefit of XR is spatial data visualization. 2D data visualization, despite being easier to navigate, can become very cluttered as projects and datasets grow. The third dimension in XR means enormous amounts of data can be seen and analyzed more effectively. Similar to the "city visualization" mentioned in the testing phase, a "solar system visualization" can be used (Schreiber et al., 2019). Here, packages become suns with planets orbiting them: planets are classes, orbits are inheritance levels, and the sizes of planets vary with the lines of code (LOC) of each class.
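To make that mapping concrete, here is a small sketch of how a package might be translated into such a solar system. The input shape and the square-root size scaling are assumptions made for illustration.

```javascript
// Sketch of the "solar system" metaphor: a package becomes a sun, its classes
// become planets, orbit radius encodes inheritance depth, and planet size
// scales with lines of code (LOC).
function buildSolarSystem(pkg) {
  return {
    sun: pkg.name,
    planets: pkg.classes.map((cls) => ({
      name: cls.name,
      orbit: cls.inheritanceDepth,  // deeper inheritance = wider orbit
      radius: Math.sqrt(cls.loc),   // LOC mapped to planet size
    })),
  };
}

const system = buildSolarSystem({
  name: 'com.example.billing',
  classes: [
    { name: 'Invoice', inheritanceDepth: 1, loc: 420 },
    { name: 'RecurringInvoice', inheritanceDepth: 2, loc: 150 },
  ],
});
console.log(system.planets.map((p) => `${p.name} on orbit ${p.orbit}`));
```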

Drawbacks of XR


Unfortunately, just like with every other technology, there are bound to be some drawbacks that need to be considered. The most obvious is cost. The Apple Vision Pro (which will be the most advanced VR headset when it releases next year) will cost $3,499.00 USD. If XR technologies like this were to be incorporated into software development, the cost would be significantly higher, as the devices would need features and applications tailored to the SDLC. Another drawback is accessibility, which can lead to the exclusion of users with disabilities. Due to the immersive nature of XR, sensory disabilities, especially vision impairments, have a massive impact on the usability of the technology. AR and VR rely heavily on the user's depth perception, as it is responsible for mentally mapping out the physical environment in 3D. For users who lack depth perception, the technology is no longer immersive, and much of its value is lost.

References


Alexey, B. (2021, June 29). Virtual Reality in Frontend Development: Tips and Trends You Need to Know. Gorilla Logic.

Bennett, S. (2023, July 12). How to Use VR Training Simulator Software. WebinarCare.

Dominic, J., Tubre, B., Ritter, C., Houser, J., Smith, C., & Rodeghero, P. (2020). Remote Pair Programming in Virtual Reality. 2020 IEEE International Conference on Software Maintenance and Evolution.

Elliott, A. (2015, January 16). Using Virtual Reality to Create Software: A Likely Future. Medium.

Elliott, A., Peiris, B., & Parnin, C. (2015). Virtual Reality in Software Engineering: Affordances, Applications, and Challenges. 2015 IEEE/ACM 37th IEEE International Conference on Software Engineering, Florence, Italy.

Elmar, M. (2023, June 3). How Many Hours a Week Do Software Engineers Work? CS Careerline.

IEEE (2022). Ethics in Virtual Reality. IEEE Digital Reality.

Merino, L., Lungu, M., & Seidl, C. (2020). Unleashing the Potentials of Immersive Augmented Reality for Software Engineering. 2020 IEEE 27th International Conference on Software Analysis, Evolution and Reengineering.

Mordor Intelligence Research & Advisory. (2023, July). Extended Reality Market Size & Share Analysis - Growth Trends & Forecasts (2023 - 2028). Mordor Intelligence.

Schreiber, A., Nafeie, L., Baranowski, A., Seipel, P., & Misiak, M. (2019). Visualization of Software Architectures in Virtual Reality and Augmented Reality. 2019 IEEE Aerospace Conference, Big Sky, MT.

Uxcel (2023). Extended Reality Design (XR). Uxcel.