EXPERIENTIAL DESIGN - Task 1: Trending Experience

 22/9/25 - 26/10/25 (Week 1 - Week 5)

✩ WONG MEI YEE 0367857

Experiential Design / Bachelor of Design (Hons) in Creative Media / Taylor's University

✩ Task 1 - Trending Experience


TABLE OF CONTENTS


Lectures


Instructions


Task 1 

AR RESEARCH & EXPLORATION

As part of my AR research, I reviewed several previous student projects from the playlist provided by Mr. Razif. These examples helped me understand how different AR tracking methods — marker-based, markerless, and location-based — can be applied creatively in experiential design. Through these case studies, I learned how AR designers balance storytelling, technology, and user engagement to create meaningful experiences.


1 – Marker-Based AR

Marker-based AR is one of the most common and accessible forms of augmented reality.
It works by detecting a specific image marker—such as a poster, brochure, or ticket—and overlaying digital content like 3D models, videos, or sound on top of it. This method is widely used in advertising, education, and entertainment because it offers stable tracking and a controlled user experience.

One of the student projects I reviewed used AR stickers as the image target, which triggered 3D models that appeared exactly the same size as the physical sticker.


https://youtu.be/RUAN4kN2SF0?si=I49EDDCzYdid9vmO


When the user scanned the sticker through the mobile camera, the flat printed image came to life — showing a small animated object or character that matched the sticker design.
This project was very inspiring to me because it demonstrated how small-scale printed materials can become interactive and playful through AR technology. The experience allowed users to view the sticker’s design from different angles and appreciate its form and texture in a more engaging way. What I found interesting was how the designer managed to preserve the real-world proportions of the sticker in 3D, making the AR object appear naturally integrated with its physical counterpart. It showed how AR can enhance tactile experiences, turning something ordinary and collectible — like a sticker — into a digital storytelling tool.
This kind of design merges graphic design and interactive media, creating a seamless connection between the physical and virtual world.

From a technical perspective, I learned that the effectiveness of marker-based AR depends heavily on image clarity, lighting, and camera distance.
If the marker is blurry or reflective, the tracking becomes unstable.
However, when executed correctly, this approach can deliver high visual impact with relatively low technical difficulty. What stood out to me most was its creative flexibility; designers can use almost any printed object as a trigger. It can be a business card, a product package, or even an event ticket. This opens up many possibilities for personalized and meaningful AR experiences.

Because of these advantages, I chose to base two of my own ideas — the AR Amusement Map and AR Personal Avatar Card — on marker-based tracking. It allows me to focus more on interaction design and storytelling rather than complex technical setup, while still achieving strong engagement with users.


2 – Markerless AR

Another example I explored was a markerless AR project from Taylor’s Design School called Patung AR.

 https://thedesignschool.taylors.edu.my/patung/#AR 


This project demonstrated markerless AR, where a virtual object — such as furniture, sculpture, or food — could be placed directly onto real-world surfaces. This method creates a strong sense of realism and immersion because users can walk around and view the virtual object from any angle. However, through my analysis, I realized that markerless AR requires precise surface detection and good lighting to work effectively. It also tends to consume more processing power, which can be challenging for mobile AR performance. Despite these limitations, I found this approach inspiring because it brings freedom of placement and supports realistic visualization.
It could be ideal for future commercial use such as interior design, retail product previews, or virtual exhibitions. Although I did not choose markerless AR for my Task 2 direction, I learned how it can enhance spatial awareness and realism in user experiences.


3 – Location-Based AR

Location-based AR is one of the most immersive types of augmented reality because it connects digital information to real-world locations using GPS, geolocation data, or spatial mapping. Instead of scanning a physical marker, the user’s position in the environment becomes the trigger that activates AR content. For example, when walking in a city or museum, users might see digital overlays, directions, or interactive 3D models appearing at specific coordinates around them.

Through my research, I found that location-based AR experiences are often used in tourism, navigation, education, and cultural storytelling. These experiences can guide users through historical landmarks, museums, or event areas by showing real-time information as they move from one location to another. This approach enhances situational learning and makes exploration more engaging because users interact with the environment both physically and digitally.

However, I also discovered that developing this type of AR is technically more demanding. It requires integration of several systems such as:

  • GPS or geolocation data for mapping user position,

  • Unity plugins or APIs for spatial tracking,

  • and network connections or backend databases to manage location-based content.

In the context of our module, it might exceed the scope of what we can build within a short semester using Unity and Vuforia alone. Nonetheless, this method is inspiring because it represents the future of AR: experiences that blend storytelling with real-world movement, allowing users to explore spaces like an interactive map come to life.

To visualise this concept, I found an online example of a location-based AR navigation system (image below).



It clearly demonstrates how digital labels and objects appear dynamically as the user moves through real spaces. This helped me better understand how designers can create meaningful spatial connections between users and their environments.


From all these examples, I learned that marker-based AR remains the most practical for small-scale, fast-development projects like this module. It allows clear control over when and how AR content appears, making it ideal for individual projects with limited resources. However, I was also inspired by how students combined animation, interactivity, and emotional storytelling to make their AR experiences more engaging.


Weekly Reflections on Class Activities & Exercises 

Week 1 

In Week 1, I was absent from class due to an approved leave and therefore did not participate in the session. However, I followed up afterward by reviewing the module briefing slides and discussing the key points with my classmates to ensure I did not miss any important information.

From what I gathered, Mr. Razif introduced the Experiential Design module, its objectives, and the assessment structure. The module focuses on designing creative Augmented Reality (AR) experiences using Unity and Vuforia, where students will explore different types of AR such as marker-based, markerless, and location-based experiences.

Through reading the module brief, I learned that Task 1 focuses on research and ideation, while later tasks will involve building an AR prototype and reflecting on the development process. Although I was not present during the introduction, I now have a clear understanding of what this module expects, especially the importance of documenting research, weekly progress, and reflection.

This week’s follow-up helped me plan how to stay consistent with my learning, and I felt more prepared to participate actively from Week 2 onward.


Week 2 

In Week 2, I was absent from class and therefore missed the hands-on activity. However, I reviewed the lecture slides shared by Mr. Razif and organised my own notes to ensure I could still follow the learning progress.

The lecture focused on introducing the differences between Augmented Reality (AR), Mixed Reality (MR), and Virtual Reality (VR). I learned that AR enhances the real world by overlaying virtual objects, MR allows digital and physical elements to interact in real-time, while VR creates a completely immersive virtual environment. Understanding these distinctions helped me see how each technology fits within the broader field of Extended Reality (XR), which combines all three under one umbrella.

The lecture also explained the types of AR experiences, such as projection-based AR, head-mounted display AR, and mobile AR. Among these, I found mobile AR most relevant to this module because it can be easily developed using smartphones and Unity with Vuforia.

Even though I was not able to participate in class activities, reviewing the materials helped me understand the fundamentals of AR design and the importance of choosing the right AR type for different experiences. This knowledge gave me a stronger foundation for future Unity experiments and idea development.


Week 3

Lecture

In Week 3, Mr. Razif introduced us to Experience Design (XD) and how it connects multiple design disciplines such as Product Design, Service Design, Visual Communication, Interaction Design, and User Experience (UX). These fields work together to create holistic and meaningful experiences for users.

He also explained the difference between UX and XD. UX focuses on digital interaction, such as how users navigate interfaces like apps or websites, while XD covers the entire journey, including physical spaces, emotions, and sensory responses. In short, UX is part of the larger XD framework.

We also learned about the Empathy Map, a tool that helps designers understand what users say, think, do, and feel. This method allows us to step into the user’s perspective and identify pain points and opportunities before designing an AR experience.

From this lecture, I learned that good design is not just about usability or visuals, but about creating emotional and contextual connections that make an experience meaningful.
This lesson helped me see how AR can be used not only for entertainment, but also for storytelling and emotional engagement in user experiences.

Class Activity

Our group focused on the customer journey of buying a chair at IKEA Cheras MyTown. We mapped out each step, starting from entering the store to making payment, to understand the overall shopping flow and identify areas that could be improved with AR. We noticed that while IKEA provides strong guidance and inspiration through its layout and displays, the shopping process can still feel overwhelming — especially when navigating multiple sections, comparing similar products, or locating stock in the warehouse.

To address these challenges, we proposed several AR-based features, such as an AI assistant to help visualize furniture in different colors or within a user’s room, AR stock availability displays, and an AR navigation system to guide customers directly to the correct aisle. We also suggested using AR for self-scanning and payment to shorten checkout times.

This exercise helped us see how AR can enhance convenience and engagement in physical retail environments by blending digital information with real-world shopping experiences.

Week 4 

In Week 4, Mr. Razif introduced us to Unity and guided us step by step on how to create a simple AR experience using Vuforia. We started with the basics, setting up an image target, activating the AR camera, and displaying a simple 3D cube that appeared when the image was scanned.

After we successfully completed the demo, Mr. Razif encouraged us to experiment further by importing free 3D models into our AR scene. When I tested it with my own image marker, the 3D object appeared perfectly aligned on top of it. It felt really exciting to see something I created come to life through the camera.

This hands-on exercise helped me understand how marker-based AR actually works — from image detection and model placement to scaling and tracking stability. To make my project more interesting, I replaced the cube with a 3D alien and a small cat model that I downloaded online. Seeing both models appear and track accurately in AR gave me confidence to apply these techniques in my own Task 2 ideas later. Overall, this session was both fun and meaningful. It helped me connect the technical process in Unity with the creative side of experiential design, where simple visuals can become interactive experiences through AR.


Week 5 

In Week 5, we continued working with Unity and Vuforia to explore more advanced interactive features for AR development. This week’s exercise focused on creating a simple menu interface and adding interactive buttons that control AR content and animation.

Following Mr. Razif’s step-by-step guidance, I first created a main menu with two buttons — Play and Credits. When I pressed Play, the scene switched to my AR camera view, where I could scan my image target to display my 3D alien and cat models. It was exciting to see the characters appear again, this time integrated within a structured interface that felt more like a complete app.
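The menu flow described above can be sketched as a small Unity script. This is a minimal illustration, not the actual class exercise code: the scene names "ARScene" and "Credits" are placeholders that would need to match scenes added in Build Settings.

```csharp
using UnityEngine;
using UnityEngine.SceneManagement;

// Attached to the main menu Canvas; each public method is wired
// to a button's OnClick event in the Inspector.
public class MainMenu : MonoBehaviour
{
    // Switches to the AR camera scene ("ARScene" is a placeholder
    // name and must exist in Build Settings).
    public void OnPlayPressed()
    {
        SceneManager.LoadScene("ARScene");
    }

    // Switches to the credits scene.
    public void OnCreditsPressed()
    {
        SceneManager.LoadScene("Credits");
    }
}
```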

Next, we learned how to add functional buttons in Unity using simple C# scripts. I created Hide and Show buttons that controlled the visibility of my cube object. When I tapped Hide, the cube disappeared, and when I tapped Show, it reappeared. I also added Play and Stop buttons to control the cube’s bounce animation, which made the interaction more dynamic.
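A sketch of what those button scripts might look like, assuming the cube and its Animator are assigned in the Inspector (the state name "Bounce" and field names are illustrative, not taken from the class files):

```csharp
using UnityEngine;

// Attached to a manager object in the AR scene; each public method
// is wired to a UI button's OnClick event in the Inspector.
public class CubeController : MonoBehaviour
{
    public GameObject cube;        // the 3D object shown on the image target
    public Animator cubeAnimator;  // Animator playing the bounce clip

    // Hide/Show buttons toggle the object's visibility.
    public void ShowCube() { cube.SetActive(true); }
    public void HideCube() { cube.SetActive(false); }

    // Play/Stop buttons control the bounce animation. Setting the
    // Animator speed to zero pauses it while keeping the current pose.
    public void PlayBounce() { cubeAnimator.speed = 1f; }
    public void StopBounce() { cubeAnimator.speed = 0f; }
}
```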

Finally, Mr. Razif introduced us to using a plane surface to display video and audio playback when scanning an image target. I tested this feature by attaching a short video clip to my target, and it worked perfectly — the video and sound played automatically when detected by the camera.
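One possible way to script that behavior, assuming the Vuforia target handler is configured to deactivate child GameObjects when the marker is lost (so enabling and disabling the plane drives playback):

```csharp
using UnityEngine;
using UnityEngine.Video;

// Attached to the plane (Quad) that is a child of the image target.
// When the target handler activates or deactivates this object,
// OnEnable/OnDisable start and stop the video automatically.
[RequireComponent(typeof(VideoPlayer))]
public class VideoOnTarget : MonoBehaviour
{
    private VideoPlayer player;

    void Awake()
    {
        player = GetComponent<VideoPlayer>();
        player.playOnAwake = false;
        player.isLooping = true;
    }

    void OnEnable()  { player.Play(); }  // marker found
    void OnDisable() { player.Stop(); }  // marker lost
}
```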

Through this activity, I learned how to combine UI design and AR interaction within Unity, which gave me a better understanding of how real AR applications are structured. It was very rewarding to see how simple buttons and scripts can make an AR experience more user-friendly and engaging.


3 AR Project Ideas

Idea 1 - AR Amusement Map

Problem Statement
Visitors often find it difficult to navigate large amusement parks because printed maps are confusing and static. Many people waste time searching for rides or facilities, especially when visiting for the first time.

Proposed Solution
Use an AR-powered 3D interactive map that appears when scanning the park ticket or wristband. This makes orientation more engaging and intuitive, allowing visitors to explore routes visually and receive voice-assisted guidance before or during their visit.

Concept

Scan the amusement-park ticket or wristband to unlock a 3D interactive park map that appears on screen.

Visitors can rotate the miniature park, tap icons for ride information, food zones, toilets, or routes, and explore in a playful, spatial way.

A new Live Mode feature simulates the visitor’s current position as a glowing light that moves along the route, helping users visualize navigation paths in real time.

To make the experience more engaging, sound cues and a friendly character voice guide users as they explore (“Let’s go to the roller coaster!” or “The snack zone is nearby!”).

Purpose

✴︎ Helps visitors quickly understand the park layout, plan routes, and reduce confusion.
✴︎ Turns navigation into an immersive, gamified experience that enhances excitement and orientation.
✴︎ Adds a storytelling layer through voice guidance and animation, creating an emotional connection to the park experience.

AR Type

Marker-based AR — the ticket or wristband (with QR code or printed icon) serves as the image target.

Input

Scan ticket or wristband (marker).

Output

A 3D interactive park map appears, featuring:

✴︎ Tappable icons for rides, food, restrooms, and exits
✴︎ Animated route lights showing directions
✴︎ A “Live Mode” glowing dot simulating user location and movement
✴︎ Voice guidance and ambient sound for immersive storytelling

Target Audience

Theme-park visitors, families, and event organizers seeking an engaging orientation tool.

Tech Dependency

Unity + Vuforia Image Target • 3D park model • UI buttons • simple animation & particle system for navigation lights • Audio Source for sound cues • basic scripting for simulated “Live Mode” movement.
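The simulated "Live Mode" movement could be scripted roughly like this. This is a hypothetical sketch for the proposal, with waypoint objects placed along the route on the 3D map and all names and values chosen for illustration:

```csharp
using UnityEngine;

// Moves the glowing "Live Mode" dot along a list of waypoints laid
// out on the miniature park map. Waypoints and speed are assigned
// in the Inspector.
public class LiveModeDot : MonoBehaviour
{
    public Transform[] waypoints;  // route points placed on the map
    public float speed = 0.2f;     // movement speed in map-scale units per second
    private int next = 0;          // index of the waypoint being approached

    void Update()
    {
        if (waypoints.Length == 0) return;

        // Step toward the current waypoint each frame.
        transform.position = Vector3.MoveTowards(
            transform.position, waypoints[next].position, speed * Time.deltaTime);

        // Once close enough, advance to the next waypoint, looping the route.
        if (Vector3.Distance(transform.position, waypoints[next].position) < 0.01f)
            next = (next + 1) % waypoints.Length;
    }
}
```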

Story Flow

Trigger → User scans the park ticket or wristband.
Explore → A 3D park map appears; the user rotates and taps icons to view ride details.
Live Mode → A glowing dot animates along the route, simulating real-time navigation.
Engagement → Character voice and sound cues guide the user to different attractions.
Result → The map visually and audibly enhances spatial understanding and excitement before the real visit.

Idea 2 - AR Menu

Problem Statement
In restaurants, customers often struggle to imagine how a dish will look or how large the portion is. Photos on paper menus can be misleading, causing uncertainty and wrong orders.

Proposed Solution
Create an AR menu system that allows customers to scan an image or QR code and view the dish as a 3D model on the table. This gives them a realistic preview of size, color, and plating — improving satisfaction and reducing order mistakes.

Concept

When customers scan a menu image or QR code, the chosen dish pops up as a 3D model on the table. They can rotate, zoom, view calories, or watch a short animation of plating.

Purpose

Makes ordering intuitive: customers can visualize dish size and presentation before deciding, reducing uncertainty and wrong orders.

AR Type

Marker-based AR — menu photo or QR code acts as the marker trigger.

Input

Scan the menu image or QR code.

Output

3D dish model with animations, ingredient info, and calorie details.

Target Audience

Restaurant diners, café owners, and food-expo exhibitors.

Tech Dependency

Unity + Vuforia Image Target • 3D food models • lighting & animation • UI for info buttons.

Story Flow

Trigger → Scan menu
Explore → Rotate or zoom 3D dish
Result → View details and decide before ordering.


Idea 3 - AR Personal Avatar Card

Problem Statement
Traditional business cards are static and often fail to leave a strong impression. In creative industries, it’s hard for designers and students to express personality or showcase their work through paper cards alone.

Proposed Solution
Design an AR business card that activates a 3D avatar when scanned. The avatar greets the viewer, introduces the owner, and links to their portfolio or social media — turning a simple card into an interactive self-introduction tool.

Concept

Scan a business card and a 3D animated avatar of the owner springs to life, waving and smiling in front of the camera.
The avatar has a stylized, low-poly cartoon aesthetic — friendly yet professional — to suit creative industries.
Floating icons appear around the avatar, linking to the person’s portfolio, social media, or contact info.
When the animation starts, the avatar also plays a short voice-line introduction, such as “Hi! I’m [Name]. Nice to meet you!”, adding personality and memorability to the experience.

Purpose

Transforms an ordinary name card into a memorable, interactive identity experience that reflects personal branding.
The animation and voice cue create a warm and human connection, helping designers, freelancers, or students leave a lasting impression during networking or exhibitions.

AR Type

Marker-based AR — card logo or QR code used as image target.

Input

User scans the business card (marker) with the mobile AR camera.

Output

✴︎ A 3D low-poly avatar appears on the card, performing a short waving animation.
✴︎ Floating icons hover nearby for portfolio, Instagram, or LinkedIn links.
✴︎ The avatar plays a voice-line greeting, followed by soft ambient sound and light effects.
✴︎ Users can tap the icons to open links or view additional profile info.

Target Audience

Students, designers, freelancers, job seekers, and exhibition participants who want to create a unique, interactive self-introduction tool.

Tech Dependency

Unity + Vuforia Image Target • Low-poly 3D avatar (rigged and animated via Mixamo/Blender) • Animator Controller for gestures • Audio Source for voice and ambient sound • UI overlay for clickable icons • Particle effects for light glow.
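The clickable-icon behavior could be as simple as the following sketch. The URL field is a placeholder to be set per icon in the Inspector; nothing here is from an existing implementation:

```csharp
using UnityEngine;

// Attached to a floating icon around the avatar; OpenLink is wired
// to the icon's button or tap handler.
public class ProfileLink : MonoBehaviour
{
    // Placeholder URL; set per icon (portfolio, Instagram, LinkedIn, etc.).
    public string url = "https://example.com/portfolio";

    public void OpenLink()
    {
        // Opens the link in the device's browser on a mobile AR build.
        Application.OpenURL(url);
    }
}
```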

Story Flow

Trigger → User scans the business card (marker).
Reveal → The 3D avatar appears and waves while delivering a short greeting.
Explore → Floating icons appear around the avatar; the user taps them to access portfolio or social links.
Result → A playful, personalized AR experience that makes networking more memorable and engaging.


Feedback

✩ Week 4
Mr. Razif advised me to include more detailed explanations within each idea so that the concepts can be clearly understood in terms of their user interaction, purpose, and technical process. He mentioned that I should elaborate on how the AR experience works step by step (what the user will see, do, and feel) to make the proposal clearer and easier to visualize.
This feedback will guide me in refining my ideas for Task 1 by expanding on the interaction flow, storytelling elements, and technical implementation details.


Reflection


Comments