So, what’s the difference? (VR vs. AR)

AR or VR, VR or AR: it’s a longstanding debate, similar to asking yourself whether you want apple or pumpkin pie for Thanksgiving dinner. Just like the pies, both are amazing, but the pros and cons of each differ. Let’s dig into it. Not the pies, VR and AR!

Augmented Reality (AR)

Augmented reality is an interactive experience of a real-world environment in which the objects that reside in the real world are enhanced by computer-generated perceptual information, sometimes across multiple sensory modalities, including visual, auditory, haptic, somatosensory, and olfactory. In simpler terms, it’s the real-time use of information in the form of text, graphics, audio, or other virtual enhancements integrated with real-world objects.

One of the coolest parts of AR is that it isn’t limited to one form. There are several different forms of augmented reality, so let’s dive into each and see how they are incorporated into your life right now!

Projection-Based AR

This form of AR projects digital images onto physical objects in a physical space. Let’s say you are moving into college soon and you are checking out your dorm, but you don’t know exactly where the furniture is going to fit or how it’s going to look. Well, thanks to projection-based AR, you can hop onto the IKEA app, which has AR integration, and begin developing a plan for that cool dorm room.

In this case, a user is utilizing the IKEA AR mobile application to experiment with different couches in order to find the perfect match for their home. There is an obvious projection of digital images into a physical space, and that’s how this AR form works! Credits to Architect Magazine for the photo.

Recognition-Based AR

Recognition-based AR is the most widely used form. QR codes and image recognition are two of the main applications for this sector of AR. One example of recognition-based AR in action is the mobile app Scannable: as you point the camera at your project or paper, the app recognizes it and, in turn, performs the scan. Another example comes from social media, specifically Snapchat. I’m sure we all know about filters, and they are another prime example of recognition-based AR in action, since your face is the focal point here and Snapchat wants to put a dog face on yours.

I was going to snap a picture of myself using a Snapchat filter but decided against it! Anyway, take a look at the use of augmented reality within social media here and how each user can insert a filter, thanks to the fact that Snapchat recognizes each face.
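Curious what “recognition” boils down to under the hood? Here’s a tiny Python sketch of the core idea: compare the camera frame against stored marker patterns and pick the closest match. Real apps like Snapchat use far more sophisticated computer vision, of course; all the names and the 3x3 “images” below are made up for illustration.

```python
# Toy sketch of recognition-based AR: match a tiny grayscale "frame"
# against stored marker signatures and decide which overlay applies.
# The marker names and 3x3 pixel grids are hypothetical illustrations.

def mean_abs_diff(a, b):
    """Average per-pixel difference between two same-sized grayscale grids."""
    total = sum(abs(pa - pb) for ra, rb in zip(a, b) for pa, pb in zip(ra, rb))
    return total / (len(a) * len(a[0]))

def recognize(frame, markers, threshold=30):
    """Return the name of the closest stored marker, or None if nothing matches."""
    best_name, best_score = None, threshold
    for name, pattern in markers.items():
        score = mean_abs_diff(frame, pattern)
        if score < best_score:
            best_name, best_score = name, score
    return best_name

markers = {
    "qr_corner":  [[0, 255, 0], [255, 255, 255], [0, 255, 0]],
    "face_patch": [[200, 180, 200], [180, 120, 180], [200, 180, 200]],
}

frame = [[5, 250, 5], [250, 250, 250], [5, 250, 5]]  # noisy view of "qr_corner"
print(recognize(frame, markers))  # → qr_corner
```

Once the marker is recognized, the app knows which action to trigger: start a scan, attach a filter, and so on.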

Location-Based AR

Are you still using a paper map, or have you transferred your travel strategy to Google Maps? Google Maps uses location-based AR to detect your location and discover new places for you to visit. For example, if you are on the highway and in need of a gas station, location-based AR lets your smartphone detect which gas stations are closest and surface the information that’s pertinent to your current location.

A PERFECT example of the use of AR in a map setting. Based on where you are (i.e., your location), Google Maps can use smart technology and augmented reality to show you the closest stores, restaurants, etc., so that you can find a place that will suffice!
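The distance math behind that kind of lookup is pretty approachable. Here’s a minimal Python sketch, with made-up coordinates and station names, of ranking nearby points of interest by great-circle (haversine) distance from your GPS fix:

```python
# Minimal sketch of the lookup behind location-based AR: given a GPS
# fix, rank points of interest by great-circle distance. All station
# names and coordinates below are invented for illustration.
import math

def haversine_km(lat1, lon1, lat2, lon2):
    """Great-circle distance between two (lat, lon) points in kilometers."""
    r = 6371.0  # mean Earth radius, km
    p1, p2 = math.radians(lat1), math.radians(lat2)
    dp = math.radians(lat2 - lat1)
    dl = math.radians(lon2 - lon1)
    a = math.sin(dp / 2) ** 2 + math.cos(p1) * math.cos(p2) * math.sin(dl / 2) ** 2
    return 2 * r * math.asin(math.sqrt(a))

def nearest(user, places):
    """Return the (name, distance_km) of the place closest to the user."""
    name, (lat, lon) = min(
        places.items(), key=lambda kv: haversine_km(*user, *kv[1])
    )
    return name, round(haversine_km(*user, lat, lon), 2)

stations = {  # hypothetical gas stations
    "Shell on Main St": (40.7128, -74.0060),
    "BP off Exit 12":   (40.7306, -73.9866),
}
print(nearest((40.7300, -73.9900), stations))
```

A real maps app adds routing, live data, and an AR view on top, but the core question, “which candidate is closest to you right now?”, starts here.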

Outlining AR

Car shopping in 2017 was one of my favorite experiences, since it gave me the opportunity to see outlining AR through a current, working application. Outlining AR’s most common use case is when you are in reverse, trying to back out of a parking spot: the car’s camera and touch-screen command center sync to show you where your estimated reversal path is. Outlining AR is also very popular in architecture and engineering, as it’s currently used to trace the outlines of buildings and other industrial structures.

Personally, outlining AR is one of my favorite applications, since I had the chance to gain exposure to this form very early in my technological life. In this case, outlining AR is demonstrating the trailer view to the driver, letting them see the car behind them where, in a normal car, this view would have been blocked. It shows the estimated spot and depicts it through an AR image overlay.

Superimposition-Based AR

Video games, doctor’s appointments, and more: superimposition-based AR has guided doctors and gamers alike, and it is one of the most common uses of augmented reality, since it relies on the main principles of object recognition and augmented view to project an image. A real-world example is when doctors meet with patients to discuss damage to bone structures. This technology superimposes an X-ray view of a broken arm onto a real image, allowing the patient to understand what the bone damage actually is.

Doctors X AR! The first thing that came to mind when I learned about this form of AR was the scene in Dolphin Tale where the dolphin needs surgery and the X-ray is projected, through AR, onto the screen. In this case, a doctor is looking at an X-ray through superimposition AR as well. Credits to Medium for the photo.
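The superimposing step itself can be sketched in a few lines. Here’s a toy Python example, using tiny 2x2 grayscale “frames” as made-up data, of blending a virtual image (like that X-ray) onto a camera frame according to an alpha mask:

```python
# Toy sketch of the superimposition step: blend a virtual image onto a
# camera frame wherever the overlay's alpha mask says so. The 2x2
# "frames" and alpha values are invented data for illustration.

def blend_pixel(camera, overlay, alpha):
    """Composite one overlay pixel over one camera pixel (0..255 values)."""
    return round(alpha * overlay + (1 - alpha) * camera)

def composite(camera, overlay, alpha_mask):
    """Apply the overlay to the camera frame, pixel by pixel."""
    return [
        [blend_pixel(c, o, a) for c, o, a in zip(crow, orow, arow)]
        for crow, orow, arow in zip(camera, overlay, alpha_mask)
    ]

camera  = [[100, 100], [100, 100]]   # what the camera sees
overlay = [[255, 255], [255, 255]]   # the virtual image (e.g. an X-ray)
alpha   = [[1.0, 0.5], [0.0, 0.0]]   # 1 = fully virtual, 0 = fully real

print(composite(camera, overlay, alpha))  # → [[255, 178], [100, 100]]
```

Real AR pipelines do this on full-resolution color frames, dozens of times per second, with the overlay warped to track the recognized object.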

Now that we know some of the applications for AR, how does it actually work?

Augmented reality is an instance where a virtual image is deployed over real-world objects. This overlay happens in real time through a camera or smart glasses, creating the illusion of an effective blended world. Now let’s get into virtual reality (VR)!

Virtual Reality (VR)

VR headsets are often referred to as HMDs, which stands for head-mounted displays. The sole purpose of VR is to cultivate a life-size, 3D environment without boundaries, so wherever you look, the view follows and you’re immersed in a totally different world. This is different from augmented reality, which overlays various graphics onto your view of the real world; VR creates an entirely new experience through a headset.

When you wear a VR headset and the picture in front of you shifts as you alter the angle of your head, the system at work is called 6DoF, or six degrees of freedom. During this process, the headset is plotting both your head’s position along the X, Y, and Z axes and its rotation around them, measuring head movements across the entire VR landscape. In addition to the six degrees of freedom, a few other components make up the head-tracking system: a gyroscope, an accelerometer, and a magnetometer.

Oculus uses this formula to let users immerse themselves in the 360-degree experience. In this case, the head movement is apparent, and from a user perspective, we have the chance to see how our movements are used in the context of VR specifically. Credits to JMU for the photo.
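To make the six degrees concrete, here’s a toy Python sketch of a 6DoF pose: three position coordinates plus three rotation angles, showing how a yaw turn changes the direction you’re looking. All numbers are illustrative, not taken from any real headset.

```python
# Toy illustration of a 6DoF pose: position (x, y, z) plus rotation
# (yaw, pitch, roll). Here we rotate the "forward" gaze vector by yaw
# to show how turning your head changes what the headset must render.
import math

def forward_vector(yaw_deg):
    """Unit gaze direction in the horizontal plane after a yaw turn."""
    yaw = math.radians(yaw_deg)
    return (round(math.sin(yaw), 3), 0.0, round(math.cos(yaw), 3))

pose = {"pos": (0.0, 1.7, 0.0),  # made-up standing height on the Y axis
        "yaw": 90.0, "pitch": 0.0, "roll": 0.0}

print(forward_vector(pose["yaw"]))  # → (1.0, 0.0, 0.0): now facing along +X
```

A full implementation tracks all three rotations (and re-renders both eye views) every frame, but the principle is the same: pose in, viewing direction out.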

Gyroscope: this sensor measures the rate of rotation around a particular axis. Through concepts such as angular momentum, the gyroscope is able to indicate orientation, allowing the VR experience to keep flowing and adapting based on head movement.

Accelerometer: this sensor measures linear acceleration. It is designed to respond to the forces and vibrations associated with movement. The most popular practical application of an accelerometer is a fitness device that tracks movement and other aspects of the exercise cycle. For example, if you are running, each step jolts your device, indicating to the smart tracker that you have taken another step; in turn, it tracks those steps and converts them into a kilometer or mile measurement.

Magnetometer: this sensor measures the strength and direction of the magnetic field in the vicinity of the instrument. The magnetometer can act as a compass, and thanks to that innate quality, it can tell which direction the user is facing and adjust accordingly.
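Here’s a rough Python sketch of how readings from sensors like these can be combined: a simple complementary filter trusts the fast-but-drifting gyroscope in the short term, while the accelerometer’s gravity reading slowly corrects the drift. The numbers and the 0.98 weight are illustrative; real headsets use more elaborate fusion (e.g., Kalman filters).

```python
# A minimal complementary-filter sketch of sensor fusion: integrate the
# gyroscope's angular rate for responsiveness, and nudge the estimate
# toward the accelerometer's gravity-based pitch to cancel drift.
# All values here are made up for illustration.

def fuse_pitch(pitch_deg, gyro_rate_dps, accel_pitch_deg, dt, k=0.98):
    """One fusion step: mostly trust the integrated gyro, slightly trust accel."""
    gyro_estimate = pitch_deg + gyro_rate_dps * dt  # integrate angular rate
    return k * gyro_estimate + (1 - k) * accel_pitch_deg

pitch = 0.0
# Simulate holding your head still at a true pitch of 10 degrees: the gyro
# reads a small spurious drift (0.5 deg/s), while the accelerometer keeps
# reporting the correct 10 degrees.
for _ in range(200):
    pitch = fuse_pitch(pitch, gyro_rate_dps=0.5, accel_pitch_deg=10.0, dt=0.01)

print(round(pitch, 1))  # converges near the accelerometer's 10 degrees
```

The design choice is the weight k: closer to 1 means snappier response but slower drift correction; closer to 0 means the noisy accelerometer dominates.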

Motion Tracking

This is a HUGE advantage of premium headsets, but companies are still working to find the most efficient way to roll out this innovative approach. Currently, when users look down while “plugged in” to a virtual headset, the first thing they want to do is see their hands and other body parts in virtual space, and this isn’t possible right now. Leap Motion, a technology that uses infrared sensors to track hand movements, is the current form of body-part identification, and it mounts on the front of Oculus development kits.

As usual, Oculus is ALWAYS taking the VR world by storm, and Oculus Touch is a set of controllers designed to make you feel as if you are using your hands in a VR setting. Similar to joysticks, their main advantage is that they give full control back to the user. The setup involves two stations around the room that cover the area with laser technology, enabling the system to track your head and both hands based on the timing of when the beams hit each sensor. This continues to increase immersion and engagement within the entire VR experience and provides a LOT of value to Oculus from a longevity standpoint.

As discussed earlier, this is one application of motion tracking. In this case, VR is able to track how your body is moving, allowing it to be more than a headset experience: a full-body one.
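The timing idea behind laser-sweep tracking, the approach popularized by SteamVR’s Lighthouse base stations and the one this two-station setup resembles, can be sketched simply: a station sweeps a beam across the room at a known rate, and the moment the beam hits a sensor encodes an angle. All numbers here are toy values.

```python
# Sketch of sweep-timing tracking: a base station sweeps a laser at a
# known rate, and each sensor converts "when did the beam hit me" into
# an angle from the station. The sweep period below is an assumption.

SWEEP_PERIOD_S = 1 / 60  # one full 180-degree sweep per 1/60 s (assumed)

def hit_time_to_angle_deg(t_sync_s, t_hit_s):
    """Angle of the sensor from the station, derived from sweep timing alone."""
    fraction = (t_hit_s - t_sync_s) / SWEEP_PERIOD_S
    return 180.0 * fraction

# The beam hit this sensor a quarter of the way through the sweep:
print(hit_time_to_angle_deg(0.0, SWEEP_PERIOD_S / 4))  # → 45.0
```

With angles from two stations to many sensors on the headset and controllers, the system can triangulate each sensor’s position and reconstruct the full pose.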

Sony is innovating in the area of controllers as well, similar to Oculus. Sony filed a patent founded on the idea of VR tracking based on light and mirrors, using a beam projector to determine positions for users. Woah, woah, woah. Imagine the possibilities.

Oculus: in terms of controllers and full-body immersion in VR, as we touched on in the motion-tracking section of the article, Oculus now ships two sensors as part of the VR experience and has added another revenue stream by selling the option to purchase a third sensor for $79, expanding the overall VR experience and play area.

Eye tracking: this is deemed the final piece of the VR puzzle, but it isn’t available on many headsets yet. Similar to hand tracking, infrared sensors are used for eye tracking, and the FOVE headset is able to track where your eyes are moving inside the headset. Head movement isn’t enough? Yes, it is, but the point of eye tracking is to improve depth of field and continue to make VR more realistic through nuances that deepen the sense of reality for the user. Simulation sickness is a HUGE problem that companies are researching, and eye tracking is one of the ways we can reduce it.
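One way eye tracking can improve depth of field: once the headset knows what depth you’re focusing on, objects at other depths can be softly blurred, just like a real eye. Here’s a toy Python sketch; the depths, blur scale, and cap are all made-up illustration values.

```python
# Toy sketch of gaze-driven depth of field: blur grows with how far an
# object sits (in diopters, i.e. 1/distance) from the depth the eye
# tracker says you're focusing on. All constants are invented.

def blur_radius(object_depth_m, focus_depth_m, scale=2.0, max_blur=8.0):
    """Blur radius for an object, given the gaze-derived focus depth."""
    diopter_gap = abs(1.0 / object_depth_m - 1.0 / focus_depth_m)
    return min(max_blur, round(scale * diopter_gap, 2))

focus = 2.0  # the eye tracker reports you're focusing 2 m away (assumed)
print(blur_radius(2.0, focus))   # → 0.0: the object you're looking at stays sharp
print(blur_radius(0.5, focus))   # a nearby object gets noticeably blurred
```

The same gaze signal also helps with comfort: rendering a scene that matches how eyes naturally focus is one of the levers researchers pull to reduce simulation sickness.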


Hey, hey, thanks for reading the article! I would love to hear your perspective on AR and VR! Check out my Calendly, LinkedIn, and podcast to get in touch, and give it a clap if it provided new insight or value to you!



