UtilityAR will be exhibiting at the National Manufacturing and Supply Chain event in Citywest, Dublin on the 17th of January
Please come and visit our stand as we will be demonstrating UtilityAR's software running on ODG, Epson, Vuzix and Microsoft headsets.
UtilityAR will be exhibiting at the Technology, Innovation and Recruitment Conference in WIT on Thursday 11th of October. Please come along and see us.
Please come visit our stand and see a demo of our solution focused on the needs of Utilities and Electrical Facilities staff.
CIX are innovating to provide greater levels of security, reliability and transparency to their customers by using UtilityAR’s software. The Augmented Reality glasses-based software helps the Data Centre to train staff more quickly and effectively, along with providing them with guidance while completing tasks. It also allows them to offer customers real-time remote viewing of their equipment whenever a customer needs to troubleshoot their infrastructure.
Augmented Reality Glasses and their role in reducing inequity in society.
UtilityAR has a mission to bring Augmented Reality (AR) Glasses to the workforce across Europe and the world. But why? We believe that AR glasses offer society an answer to some of the major issues challenging us today. It is a lofty and worthy ambition… a BHAG, but well worth our time and effort. And perhaps… your attention too.
Societal imbalance, unemployment and under employment
Unemployment and underemployment, particularly among young people, is causing significant societal problems across Europe and the world. Many believe it is a significant factor in recent political upheaval, along with increasing instances of depression, obesity and suicide. This marginalisation of large portions of society causes an imbalance which needs to be addressed, and it is a terrible waste of potential.
AR glasses can allow a less experienced technician to complete a job which they previously could not. In addition, they can be used to train unqualified people more quickly and cost-effectively. We believe there is plenty of work to be done; all we need are the people. AR glasses allow us to begin to address the imbalance in our society and bring people in from the margins.
Staff shortages and the deployment of distributed renewables
Across Europe and the world we see full employment of appropriately experienced, trained and qualified people. Though graduates straight out of college can struggle to find work, and older staff who don’t have the correct qualifications also have trouble, experienced, trained and qualified workers are in great demand. The pressure on hiring departments to find appropriate people can force a dangerous reduction in standards. As explained above, AR glasses allow an organisation to put an old head on a young (qualified) worker’s shoulders, as well as to train up an unqualified worker (to get the qualification) more quickly.
In addition, the transition from centrally dispatched fossil fuel power stations to large numbers of distributed renewables creates a need for significant numbers of technical staff to deploy, maintain and operate the assets. The importance of deploying these renewables is clear. AR glasses allow the rapid upskilling of workers to execute the change, along with providing them with the tools they need to maintain the equipment.
Upskilling of work force in developing countries
As millions in the developing world join the middle class, the shortage of skilled workers to deploy new technology in these regions is a major barrier to progress. AR Glasses allow technology providers who are deploying equipment in these regions to upskill local workers quickly. This transfer of technology, skills and experience is positive for all, reducing travel and increasing the local knowledge base. When combined with the opportunity (or necessity) for renewables deployment in these regions, we think AR glasses have a significant part to play.
We think everyone would agree that the issues highlighted above are serious and need to be addressed. We believe AR glasses linked to our technology are part of the solution. The problem is large, however, and the opportunity is great. It excites us. It is a worthy goal.
When developing Augmented Reality applications for Smart Glasses, some customers are confused about the options of pinning information to an object or point in space versus a heads-up display. This article explains our thinking on the matter.
Pinned to a point in space display
Pinning an image to a point in space or to an object means locating virtual graphics or information at a point in space, or linking them to an object, in the real world. This could mean that relevant information sits above a real-world object (such as an asset’s name/info floating above it in space), or a user could pin different information streams (work procedures, a remote adviser call etc.) in virtual space around them while they are doing a task. The technology requires the glasses to recognise the real-world items the graphics are being pinned to, the movement of the glasses, and their position relative to the virtual items, so that when the wearer moves or turns their head, the virtual item stays fixed to the real-world object or pinned point in space, not to the wearer’s head (as is the case with a heads-up display).
This allows the wearer to view the data they require in proximity to the relevant asset they are working on, and it works well when there are many assets in view, each with relevant information to be displayed. It also allows the wearer to have much more information pinned around them, increasing the amount of information they can consume. Finally, when combined with 3D graphics it allows truly augmented experiences, where a glasses wearer can see realistic-looking graphics overlaid on a real-world object. This can be particularly useful for training.
On the other hand, this option can be slower to roll out at scale, is more immersive (which in many applications is a bad thing – see my articles on this topic) and requires much more sophisticated glasses. Incorrect execution can lead to “jumpy” graphics, as the glasses struggle to keep the various data streams together, which can be off-putting.
Heads Up Display
“Heads-up display” refers to a display system whereby an image or information is located in a fixed position relative to the wearer’s vision. This might be directly in the centre of their vision (for critical information such as instructions they need to read or an image they need to look at), or on the edge of their vision (for peripheral information, such as a live feed from a meter, which they need to be aware of but not focus on). It means that if the glasses/headset wearer turns their head, the text/image/video will move with them and remain in the same part of their vision at all times.
This has the advantage of being the easiest to see and control, as the centre of our vision is where we have the best focus. Many believe this option is less distracting: the wearer already needs to be aware of their surroundings, and putting the augmented information they require front and centre allows them to focus. It is also clear to the wearer that the information is not part of the real world, which reduces the risk of confusion relating to immersion. Finally, it means that the relevant information is always in the wearer’s vision, rather than moving outside it as can happen with a pinned-location display.
The drawbacks are that it is less immersive and so not suitable for many training applications. It also limits the quantity of information that can be available to the wearer. Finally, it creates a risk that the wearer could view information for the wrong asset, as the information does not move with the asset.
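For readers who like to see the mechanics, the difference between the two modes comes down to one transform: a world-pinned item must be re-projected into the wearer's frame every time the head pose changes, while a head-locked (heads-up) item ignores head pose entirely. Here is a minimal 2D sketch in Python; the function names and the simplified planar maths are ours for illustration, not any particular headset's API:

```python
import math

def world_to_screen(world_xy, head_xy, head_yaw):
    """Project a world-anchored point into head-relative (screen) coordinates.
    The result changes as the head moves or turns, which is exactly what makes
    the item appear fixed in the real world."""
    dx = world_xy[0] - head_xy[0]
    dy = world_xy[1] - head_xy[1]
    # Rotate the offset into the head's frame of reference.
    c, s = math.cos(-head_yaw), math.sin(-head_yaw)
    return (c * dx - s * dy, s * dx + c * dy)

def head_locked(screen_xy, head_xy, head_yaw):
    """A head-locked (heads-up) element ignores head pose entirely:
    it is always drawn at the same spot in the wearer's view."""
    return screen_xy

# An asset label pinned 2 m in front of the wearer's starting position.
label_world = (0.0, 2.0)
print(world_to_screen(label_world, (0.0, 0.0), 0.0))          # head straight ahead
print(world_to_screen(label_world, (0.0, 0.0), math.pi / 2))  # head turned 90°: label shifts in view
print(head_locked((0.3, 0.1), (0.0, 0.0), math.pi / 2))       # HUD element: unchanged
```

Real glasses do this in 3D with full six-degree-of-freedom tracking, which is why pinned display demands more sophisticated hardware, but the principle is the same.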
We receive many queries from organisations who would like to work with us on their ideas for Augmented Reality (AR) apps. Some of these applications are more suitable for Virtual Reality (VR) however, so we thought it would make sense to set out our thoughts on this as a post here.
Immersive vs Non-Immersive vs Minimal-Immersion Experiences
In many cases, people hear AR but think VR. VR is an immersive technology whereby the headset wearer feels they are in a different place/world. In a business context, this can be very useful for training as it allows an employee to be trained up in a scenario, situation or on a piece of equipment which is hard to access in the real world.
We believe however that AR works best when a non-immersive or minimally immersive experience is required. When providing an operator with a piece of information to assist them to get their job done, we design on the theory that the operator should be present in the real world and so a non-immersive experience is best. What we mean is that if we are presenting the operator with data, text, images or video to assist them to do their job, they should easily be able to tell that the injected information is not a real part of their surroundings, and so they won’t mistakenly let it confuse them about what is real and what is not.
In some cases where training on a piece of equipment is required, AR can be used to give the trainee the impression that the state of the equipment is not as it actually is (imagine giving them the view that the filter is dirty, or that the warning light is red, when it is not). We call this minimal immersion, as the trainee only sees a small part of their surroundings in an altered state. In general, they remain fully aware of their surroundings.
So, to sum up our approach at UtilityAR: when deciding between AR and VR, ask the question “How much immersion are you looking for in this application?” If the answer is that it should be an immersive experience, then go with VR. If you want the user to remain grounded in the location they are actually in, then AR is the solution for you.
To start, what is Augmented Reality (AR)? AR is a technology that allows the user to see a mix of reality and computer-generated images/text at the same time. It can run through cameras (think Pokémon Go or Layar) or, more excitingly, it can be used on new “Smart Glasses” or “Smart Headsets”, which allow the wearer to look at the world as normal while having images, text or animations superimposed upon their vision.
The advantage of phones or tablets is that most users already own a device. The advantage of glasses or headsets is that the wearer is hands-free and can actually see the real world, rather than relying on a hand-held device to provide the image. The glasses have obvious attractions for H&S-conscious industries that need their staff to be present in the real world. The information superimposed on the wearer’s vision can be text or images relevant to the work they are doing, while being hands-free leaves staff free to get their job done.
So AR at its core provides an exciting new method of providing a technician, operator or field worker with the information they require while they are doing their job. The hands-free element of the device means that the worker can continue to do their job as normal, while the merging of this information with the real world allows it to be superimposed on the point where it is required. This leads us to exciting opportunities to highlight to a technician the exact button to push, screw to open or lever to pull. It also allows an operator to be shown arrows or pointers to the exact location where they are to receive a collection or bring a package.
If these advantages weren’t enough, AR gets more exciting still when combined with a number of forward-facing camera-based technologies.
Most of the devices mentioned above have a forward-facing camera which can be used to provide relevant information to the computer in the device and so improve the quality of the data being shown to the wearer. Examples include using a QR code reader or similar (running in conjunction with the camera) or object recognition to let the device know what the user is looking at. This may mean identifying the piece of machinery being fixed by a technician or an asset being inspected by an engineer. Once the machine/asset has been identified, the AR device can superimpose relevant information, such as vital operational data, an inspection checklist or 3-D imagery of the maintenance task that needs to be completed.
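As a rough illustration of the identification step: once the camera has decoded a QR code, the payload is typically just a string that keys into an asset database. The payload format, registry and field names below are hypothetical, purely to show the lookup pattern:

```python
# Hypothetical asset registry; in a real deployment this would come from
# a CMMS or asset-management database, not an in-memory dict.
ASSET_REGISTRY = {
    "PUMP-07": {"name": "Feedwater Pump 7", "checklist": ["isolate power", "check seals"]},
    "VALVE-12": {"name": "Inlet Valve 12", "checklist": ["inspect actuator"]},
}

def lookup_asset(qr_payload):
    """Map a decoded QR payload (e.g. 'asset:PUMP-07') to the overlay data
    the glasses should superimpose. Returns None for unrecognised codes so
    the display can fall back to showing nothing rather than wrong data."""
    prefix, _, asset_id = qr_payload.partition(":")
    if prefix != "asset":
        return None
    return ASSET_REGISTRY.get(asset_id)

print(lookup_asset("asset:PUMP-07"))
```

Object recognition replaces the QR decode with a vision model, but the same lookup happens afterwards: identify the asset, then fetch what to overlay.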
Another extremely exciting feature of AR combined with a forward-facing camera is “Remote Specialist” technology. A remote specialist can see what the device user is seeing through the forward-facing camera (imagine a Skype video call where the camera at the technician’s end is the forward-facing camera on a pair of glasses) and then “draw” pointers or guidance notes on the wearer’s vision to help them troubleshoot a problem. The specialist may ask the wearer to open a cabinet, turn a valve clockwise or press the green button. The technology allows them to point to the exact cabinet, draw an arrow to confirm which way is clockwise, and highlight which button is green (as opposed to red!).
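One small but important design detail in this kind of remote-annotation feature is keeping the specialist's drawings resolution-independent, since the specialist's screen and the wearer's display rarely match. A common pattern, sketched here with made-up names rather than as a description of any product's actual protocol, is to send annotations as normalised coordinates and scale them on the device:

```python
def place_annotation(norm_xy, display_wh):
    """Convert a specialist's normalised annotation coordinate (0-1 range,
    relative to the shared camera frame) into pixel coordinates on the
    wearer's display. A resolution-independent wire format lets different
    headsets render the same pointer in the same apparent place."""
    nx, ny = norm_xy
    w, h = display_wh
    return (round(nx * w), round(ny * h))

# The specialist taps just right of centre on their screen;
# the wearer's display happens to be 1280x720.
print(place_annotation((0.55, 0.50), (1280, 720)))
```

The same pixel maths applies whether the annotation is a dot, an arrow endpoint or a highlight box corner.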
The above highlights the basic technologies of AR. After that, it is up to you and your partners to come up with exciting and beneficial use cases. Asset management, checklist completion, troubleshooting, training, design… the possibilities are endless. Exciting times!
UtilityAR will be demo-ing at 3D Camp in Dublin on Thursday the 23rd of November. We are delighted to participate in this event and hope to see you there.