
  • Influential interactive media technologies timeline

    Developers today are spoilt for choice when it comes to powerful interactive media technologies. There are so many quality systems available that it can be hard to grasp them all. This article explores the humble beginnings and the evolution of the technology that makes modern digital communication possible. Please feel free to add or comment on the ones you love most, or even those that annoy you most, if you prefer.

    1985 - VideoWorks / Director
    A multimedia application authoring platform created by Macromedia, owned and developed by Adobe Systems since 2005. Director is the primary editor on the Adobe Shockwave platform, which dominated the interactive multimedia product space during the 1990s. It started as MacroMind "VideoWorks", an application for the original Apple Macintosh. Animations were initially limited to the black and white of early Macintosh screens. The name was changed to "Director" in 1987, with new capabilities and the Lingo scripting language following in 1988. A Windows version became available in the early 1990s.

    1993 - User Experience
    Don Norman came to Apple in 1993 as Vice President of Research and Head of the Advanced Technology Group (ATG); he brought with him the new term User Experience Design (UX). Depending on the product, UX can integrate interaction design, industrial design, information architecture, visual interface design, instructional design, and user-centred design. UX ensures coherence and consistency across all of these design dimensions. UX design defines a product's form, behaviour, and content.

    1995 - Flash / Shockwave Authoring
    The precursor to Flash was SmartSketch, a drawing application by FutureWave Software; Macromedia acquired FutureWave in 1996, and its FutureSplash Animator became Macromedia Flash. Flash was a two-part system: a graphics and animation editor known as Macromedia Flash, and a player known as Macromedia Flash Player. Adobe Systems acquired the entire portfolio of Macromedia products in 2005, and Macromedia Flash eventually became Adobe Animate. It is a multimedia software platform for animations, browser games, rich Internet applications, desktop applications, mobile applications and mobile games. Flash displays text, vector graphics and raster graphics to provide animations, video games and applications. It allows streaming of audio and video and can capture mouse, keyboard, microphone and camera input.

    1995 - JavaScript
    A high-level, dynamic programming language. Alongside HTML and CSS, JavaScript is one of the three core technologies of World Wide Web content production; most websites employ it, and all modern Web browsers support it without the need for plugins. JavaScript is prototype-based with first-class functions, making it a multi-paradigm language that supports object-oriented, imperative, and functional programming styles (a short illustration follows this timeline).

    1995 - Flash Player
    Freeware for using content created on the Adobe Flash platform, including viewing multimedia, executing rich Internet applications, and streaming video and audio. Flash Player can run from a web browser as a browser plugin or on supported mobile devices. It was created by Macromedia and has been developed and distributed by Adobe Systems since Adobe acquired Macromedia.

    2003 - WordPress
    WordPress is a free and open-source content management system (CMS) based on PHP and MySQL. WordPress users can install and switch between different themes. Themes allow users to change the look and functionality of a WordPress website, and they can be installed without altering the site's content.
    2007 - Silverlight
    Microsoft Silverlight is a deprecated application framework for writing and running rich Internet applications, similar to Adobe Flash. While early versions of Silverlight focused on streaming media, later versions supported multimedia, graphics, and animation. Silverlight was also one of the two application development platforms for Windows Phone, yet web pages that use Silverlight cannot run in the Windows Phone or Windows Mobile versions of Internet Explorer, as there is no Silverlight plugin for Internet Explorer on those platforms. Since September 2015, Silverlight is no longer supported in Google Chrome, and Microsoft has set the support end date for Silverlight 5 at October 2021.

    2007 - HTML5
    A markup language used for structuring and presenting content on the World Wide Web, and the fifth and current version of the HTML standard. It was published in October 2014 by the World Wide Web Consortium (W3C) to improve the language with support for the latest multimedia while keeping it both easily readable by humans and consistently understood by computers and devices such as web browsers and parsers. HTML5 is intended to subsume not only HTML 4 but also XHTML 1 and DOM Level 2 HTML.

    2009 - GitHub
    GitHub is a web-based Git repository hosting service. It offers all of the distributed version control and source code management (SCM) functionality of Git and adds its own features, providing access control and collaboration features such as bug tracking, feature requests, task management, and wikis for every project. It offers plans for both private and free repositories on the same account; the free repositories are commonly used to host open-source software projects. As of April 2016, GitHub reported having more than 14 million users and more than 35 million repositories, making it the largest host of source code in the world.

    2010 - Sketch
    Sketch is a proprietary vector graphics editor for Apple's macOS, developed by the Dutch company Bohemian Coding. It won an Apple Design Award in 2012. Sketch was first released on 7 September 2010. On 8 June 2016, Bohemian Coding announced on their blog that they were switching to a new licensing system: licences would allow users to receive updates for one year, after which they could either continue using the last version published before the licence expired or renew their licence to receive updates for another year.

    2012 - Bootstrap
    Twitter Bootstrap is a highly customisable HTML/CSS framework that speeds up development time and handles cross-browser issues. Just like WordPress, it features themes. It is meant to be very time-efficient compared with similar development platforms.

    2014 - PaintCode
    This application enables the developer to draw controls, icons, and other graphical elements as one would in programs such as Sketch, Photoshop, or Illustrator. PaintCode has one major difference, however: it generates Objective-C or Swift Core Graphics code from your drawings in real time.

    Please feel free to add to this timeline.
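    As promised in the JavaScript entry above, here is a small illustration of what "prototype-based with first-class functions" and "multi-paradigm" mean in practice. It is a sketch only; every name in it is invented for the example.

    // Prototype-based: objects can inherit directly from other objects.
    const shape = { describe() { return "a " + this.kind; } };
    const circle = Object.create(shape); // circle's prototype is shape
    circle.kind = "circle";
    console.log(circle.describe()); // "a circle"

    // First-class functions: functions are values that can be passed around.
    const twice = (f) => (x) => f(f(x));
    const increment = (n) => n + 1;
    console.log(twice(increment)(3)); // 5

    // Imperative style: an explicit loop with mutation ...
    let total = 0;
    for (const n of [1, 2, 3]) total += n;

    // ... and functional style: the same sum without mutation.
    const sum = [1, 2, 3].reduce((acc, n) => acc + n, 0);
    console.log(total === sum); // true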

  • Is our society being hijacked by technology?

    Relationships: The race for attention compels social media companies to force-feed us virtual life as we slowly abandon our face-to-face communities.
    Mental Health: The race to keep us on screen 24/7 makes it harder to disconnect, increasing stress and anxiety.
    Democracy: Social media rewards outrage and falsehood, and divides us until we can no longer agree on anything.
    Our Children: The race to keep children’s attention trains them to replace their self-worth with "likes" and creates a seductive illusion of missing out.
    Read more ...

  • “Never-to-work-again” class?

    It is now widely believed that within 15-20 years, about half of the Western workforce will lose their jobs to AI and automation. This means that half of the working population could end up being thrown on a permanent scrapheap. Although a generous welfare system could prevent such a social calamity, many voices warn that the mass displacement of traditional jobs could seriously weaken society. A permanent loss of employment could lead to a perpetual sense of hopelessness amongst the new "never-to-work-again" class. Isolation, mental health problems, alcohol, drug abuse - this could be the price we pay for bringing AI into our lives.

    It is often said that once machines can write their own code, humanity may be doomed. But machines already do that. The ultimate change will happen when machines develop their own motivations and desires. Just imagine the ceaseless changes that such technology, if ever developed, would bring to our lives.

    Perhaps the slightly better news is that the pain of adjusting to the brave new world of AI will, in time, force us to look deeper into our very existence as individuals and as a society. It will force us to ask ourselves new questions: What is our function if there is nothing to fight for? What is our purpose if there is no daily struggle to survive? Is it the prospect of limitless freedom that we find so terrifying? Can we comprehend a life worth living that is free of perpetual existential crisis? Or a meaningful and purposeful life without the daily grind that most of us now seem to despise?

    A society may eventually emerge that is very different from the one you and I know. Modern society, in the way it runs its daily business, is a bit like a massively overloaded cargo ship battling rough seas to stay afloat. Unfortunately, the heavy load it carries may be the very cause of its demise - all that wealth on board, yet pointless whenever nature says no to our ambitions and our relentless desire for personal wealth. The consequences for our kind could be catastrophic if we don't address the social and political implications of powerful yet little-understood disruptive technologies, such as AI and the looming bonfire of traditional jobs it is likely to bring about. But, sadly, nature has a habit of ruining our best-conceived plans.

  • Inspirational design quotes

    “Imagination is more important than knowledge. For knowledge is limited, whereas imagination embraces the entire world, stimulating progress, giving birth to evolution.” — Albert Einstein

    “It’s never been easier for audiences to skip, filter, or avoid advertising, so the best ideas are the ones that respect the audience's need to get something out of it; they should inspire, satisfy, or motivate. You can’t just bombard people with advertising messages anymore and hope they'll respond.” — Ajaz Ahmed, Velocity: The Seven New Laws for a World Gone Digital

    “Typography can wield immense emotional power. Whether classical or modern, type oozes sensibility and spirit before a word is even read.” — Gervasius Bradnock

    “Embrace restrictions. Some of the best ideas and solutions come from constraints. If there aren't any, go ahead and create some of your own.” — Robert Fleege

    “Simplicity is the ultimate sophistication.” — Leonardo da Vinci

    “We do not see things as they are. We see things as we are.” — The Talmud

    “You can have an art experience in front of a Rembrandt… or in front of a piece of graphic design.” — Stefan Sagmeister

    “There are three responses to a piece of design – yes, no, and WOW! Wow is the one to aim for.” — Paul Rand

    “Never fall in love with an idea. They’re whores. If the one you’re with isn’t doing the job, there’s always another.” — Chip Kidd

    “Ideas are a dime a dozen, but we find that oftentimes what’s much harder is to have the discipline to decide to leave things out.” — Jen Fitzpatrick

    “Perfection is achieved not when there is nothing more to add but when there is nothing left to take away.” — Antoine de Saint-Exupéry

    “Graphic design is the organisation of information that is semantically correct, syntactically consistent and pragmatically understandable.” — Massimo Vignelli (Or perhaps, to put it simply, it is the Three Cs: CORRECT, CONSISTENT & CLEAR. Z.T.)

    “A picture is worth a thousand words. An interface is worth a thousand pictures.” — Ben Shneiderman

    “If you think mathematics is hard, try web design.” — Pixxelznet

    “Any fool can make things bigger, more complex, and more violent. It takes a touch of genius and a lot of courage to move in the opposite direction.” — Albert Einstein

  • Graphic design is MENTAL!

    by Ben Longden, Thursday, 10 October 2019

    Ben Longden is the digital design director at The Guardian. He has worked on some of the biggest news events of recent years, such as Cambridge Analytica and the Paradise Papers. Reflecting on his route through graphic and digital design, he has recently written a book, Graphic Design is Mental.

    As someone who is passionate about design, design education and mental health, I wanted to write down my thoughts, which have culminated in a book, Graphic Design is Mental. Below is an extract from this book, reflecting my career experience from my role as digital design director at The Guardian, teaching at Shillington College and running a shop, RoomFifty, with Chris Clarke and Leon Edler. The result is a sort of self-help guide to being a graphic designer and an exploration of creativity and mental health, which I hope might be useful to someone like me: someone who is creative but often frustrated, sometimes nervous, but always looking for ways to be better and improve what they do, and what they love.

    BE KIND TO YOURSELF. LEARNING IS HARD, DESIGN IS EASY.
    Learning new skills is one of the most satisfying and frustrating things you can do as a designer. If you give yourself the time and space to do this, design will soon feel like second nature to you. When learning a new skill, like software or a way of thinking which is new to you, it's really easy to beat yourself up when things aren't going the way you think they should. I guess there are two points here, the first one being that learning is hard. If you are a creative person who needs to learn by doing, there is no linear structure. The best thing you can do is to get stuck in and play, and view learning like playing with a new tool. For me, I learned through play, by using my hands to create marks and bringing them into the design, or by experimenting with software. The frustrating side of everything comes in this play stage, when your ambition to create and your technical ability don't quite match up. This is where the frustrated creative can rear its head, and you often feel as if you can't do it, or that you aren't very good. Know this: your ambition and your skills will soon match up, and the thing you see in your head will soon be possible to bring to life. I remember when I first started designing, I could always see where I wanted to get to from the start (even during the briefing I knew what I wanted to do), but by the end of the project it looked nothing like it. This is partly the process you go through, and partly because my ambition was greater than my skill set, but there was a click, at a certain point, where I felt "yes, that's what I saw when I started thinking about this project". That's satisfying, and if you stick at it, it will come.

    The second point is about the way you think it should go. This is an expectation that should be left at the door; no project will ever be the way you expect. This is where the joy lies in being a creative – your eyes and mind need to be open to looking and thinking about the possibilities, and not setting expectations for yourself or your work. This can be a freeing and liberating approach and can feel much less stressful. Whenever I was struggling with a brief, either as a student or a junior designer, I would keep telling myself to trust the process that I know: sketch, write, try, expand and really search around for ideas. They are there and you will find them.
    You have to trust in the process and not let moments of "this is not going the way I thought it would" creep in. Ideas are there and you just have to catch them.

    IT’S NOT ABOUT YOU, IT’S ABOUT THEM
    Whenever anyone gives you their opinion, know that it's their opinion of the work; don't take it personally. Critique is a good thing, and you should always give it too. Don't say "that looks nice", as it won't help anyone. Expect the same for your work.

    CLIENTS CAN BE MEAN WHEN THINGS GO WRONG
    You are basically working on their baby, and it's a precious baby. If a client sees that even a small thing has gone wrong, or isn't quite working (especially on a website), they will probably freak out and blame you. But it's really not your fault. Take a breath, know that no one has died, and deal with it in a calm and considered way. Everything can be fixed in this way. Whenever something goes wrong, it always feels like the end of the world, but in reality it's obviously not. Mistakes happen, it's just the way we are, and mistakes always happen when you are learning. I always remind myself that it's not what happened, it's how you deal with it now that matters. You can't take back past mistakes; all you can do is learn from them and not repeat them.

    CONFIDENCE COMES IN MANY FORMS
    For me, confidence comes from the work I do. I get more confidence from showing my work than hiding it, from being open to critique and change. Sharing your ideas and challenging yourself to do something new and different will bring you as much confidence as you let it, as long as you listen and take on board what people are saying to you.

    HOW TO DEAL WITH THE BIG PROJECTS
    A big project is just another project, with the same process as the smallest ones. Whether it's going to be seen by one person or a million people, the route is the same. The only difference is that when you launch a big project you will only see the negative comments and never the positive ones. The internet is a horrible place when it wants to be, and those with positive opinions generally stay quiet. In January 2018 we launched the redesign of The Guardian's website, app and newspaper all at the same time. It was something that, from what we could remember, had not been done in that way ever before. It was the biggest, and most prominent, project that I had ever worked on, and we knew that if we did it in this way, with a big bang, we would cause people to take note. It's not the way you do things in digital these days – especially with established brands and platforms, where you iterate, iterate, iterate so that the change is less dramatic for the audience, and less so for the business. But, as we're The Guardian and it was a big moment for the organisation, it felt right to launch with a big bang. Surprise! Your daily newspaper looks different. This safe zone of iterating and iterating did not exist, and we were putting our proverbial design necks on the line. We had shown it to a select group in user testing and we knew that the design wouldn't get in the way of their reading experience; in fact, it was going to enhance it. But people don't like change, especially when it comes to a brand that has been by your side and looked familiar for 15 years. We hit the button to go live at 6am on the morning of 15 January 2018. For the first time in about three months we had little to do but wait for our Twitter feeds to start chirping, and this is what they said: “This is the worst decision you’ve ever made.” “Bright red heading though.
    Seriously hun?” “Was it designed by your unpaid intern?” As I said, the internet can be a horrible place and – if you let it – you could spiral into a whole world of pain, thinking that the last three to six months' worth of work was a waste, and that you had ruined one of the most loved brands in the world. Forever. But give it less than 24 hours and you will see that the change can be a positive one. For us, we saw more people reading, and for longer, no drop in ad revenue, and a positive change for a brand that had been using the same design for a long time. One reader summed it up well with his series of tweets: "Looks a bit messy and cluttered", then "Edit: Changed my mind, just took a small while getting used to it."

    BE YOURSELF
    Nuff said. Don't be shy with your ideas; they are your ideas and no one is judging you. Put them out there and see what happens.

    DON’T TAKE ON WORK YOU CAN’T DO
    You will burn out. This, for me, as a designer who always wanted to push themselves and be the best at what they do, is the most important lesson I have learned. Throughout my shortish career, this has manifested itself in a couple of ways. The first was when I was starting out and I took on a project outside of my day job: building a website for a small photography studio. I had built a couple of very simple websites at this point, and so I was feeling confident! But it soon became clear that my knowledge and ambition were misaligned. The stress that it added to me personally was not worth it, not to mention that in the end I had to give it up and tell the client that I couldn't do it. I don't beat myself up for trying, and for having the ambition to want to do the extra work. Had I taken a step back and said to myself that it was too soon, that wouldn't have been the worst thing in the world. Hindsight is wonderful for that, and even though it was a bad decision to take on the project, of course, you learn things, whether it be something about yourself or a new skill. This is not to say you shouldn't push yourself and find your discomfort zone, but don't merely throw yourself into the deep end unnecessarily. Know that your time will come to be able to take on those challenges and do them well. The second moment was not too long ago, when I was a fully-fledged designer, working at The Guardian but also juggling side projects while teaching – all of which I could do, and do well, for a while. As time went on, I was stretching myself too thin, to the extent that I felt exhausted and it became a chore. The advice not to work too much might sound obvious, but sometimes, if you are in any way inclined to get excited by creative work, it's really easy to say yes. I believe that creative work gets inspired by other creative work you are doing, and by the work others are doing around you. Although this is a natural cycle, it's still one to approach with caution. Bear in mind that clients often don't care too much about the other stuff, and that's the pressure you will feel. I love taking on creative work, but I know that love for creative work can often take precedence. You have to take care of yourself, and your mind, to make that work the best it can be.

    IT’S A JOB FOR SOME AND NOT FOR OTHERS
    I have worked with some people who think that design is design and it's just a job, 9-5, and that's all. That is ok, and it is a job, but for others it is a passion as well as a job.
    Working with people who don't share the same energy and passion for what they do can be frustrating, as you don't always feel you can generate ideas and bounce them back and forth. It's ok that for some it's just a job, but find someone you can have those ideas with and share them. Don't take it personally; you haven't lost your edge.

    www.indiegogo.com/projects/graphic-design-is-mental

  • Usability & UX, are they the same? If not, what's different?

    In an article published on UXdesign.com, UX design author Michael Cummings makes a clear distinction between the two. He states that UX design puts the "emphasis on the human side of human-computer interaction, and its effective results, rather than on the mere usability, the human performance aspect of computer interface design, which traditionally relates to the field of ergonomics."

    UX » emphasis on human-computer interaction
    USABILITY » related to the field of ergonomics

    UX is one of the main concerns of user-centred web design, and the two areas are indeed related. The Oxford Dictionary defines UX as "the overall experience of a person using a product such as a website or a computer application, especially in terms of how easy or pleasing it is to use: if a website degrades the user experience too much, people will simply stay away."

    UX AND ITS ROOTS - A BRIEF HISTORICAL OVERVIEW
    Pabini Gabriel-Petit, Principal Consultant at Strategic UX in Silicon Valley and founder, publisher and Editor-in-Chief of UXmatters, writes in its November 2005 issue: "When Don Norman came to Apple in 1993 as Vice President of Research and Head of the Advanced Technology Group (ATG), he brought with him the new term user experience design. UX design takes a holistic, multidisciplinary approach to the design of user interfaces for digital products. Depending on the product, UX can integrate interaction design, industrial design, information architecture, visual interface design, instructional design, and user-centred design, ensuring coherence and consistency across all of these design dimensions. UX design defines a product's form, behaviour, and content."

    The Macintosh, the first mass-market computer featuring a graphical user interface (GUI), was launched in 1984. It transported computer users from the command line to a mouse-operated cursor and clickable on-screen items, or icons, as they became popularly known. It is important to note that in the late 80s, and even in the early 90s, many prominent tech commentators and academics remained sceptical about the GUI, now a critical component of UX. It is easy to see the connection between the tectonic changes that the GUI brought to computing and, especially, to contemporary mobile devices. The GUI was originally dreamed up and developed at Xerox's Palo Alto Research Center (PARC), but it was Steve Jobs who recognised its enormous potential in the rapidly growing market of personal computing.

    Louis Anslow's article on Timeline.com explores how experts and analysts responded to the Macintosh and other platforms featuring a GUI. It states that the tech media reports seemed less than favourable, expressing doubt whether icon- and mouse-driven interfaces could ever successfully compete with the command line. In 1990, six years after the Macintosh launch, The New York Times, in an article titled "The Computers That Mimic a Desk," featured a mocking illustration of a man in a suit on a very high chair, sitting at a rather bizarre-looking, giant desk-like screen. In June of the same year, at a time when the GUI was well established on many different platforms of the day, Marcia Peoples Halio, a member of the English Department at the University of Delaware, released a critical paper titled "Student Writing: Can the Machine Maim the Message." In this paper, she suggested that the quality of work students produced on a Macintosh with a graphical interface was inferior to the work that students completed using the command line. Halio's academic colleagues, however, disagreed with her findings.
    In an article in the journal Computers and Composition, they argued that the "article is so seriously flawed by methodological and interpretive errors that it would probably have been dismissed had it appeared in a journal directed to an audience of professional writing teachers. Publication in Academic Computing has given it wide circulation, however, not only among faculty members involved with writing instruction but also among administrators responsible for purchasing equipment for their campuses. Its potential [negative] impact is therefore considerable."

    Chris Goyens, a prominent tech author, argued at the time: "What began as the Macintosh revolution, using a 'mouse' to pinpoint and call up editorial options which were represented by icons — pictures of file folders, electric calculators, painter's palette and trash cans — now has spread." Goyens dubbed the phenomenon "icon-mania" and called GUIs a "mixed blessing," arguing that the system was not much of a gain and that, with word processing, it made hardly any difference at all.

    Despite the many influential doubters of the day, Steve Jobs' vision of a graphical interface on personal computers prevailed. Touchscreen technology as we know it today, now a ubiquitous user-interface component, was affirmed by the emergence of the iPhone.

    A BRIEF OVERVIEW OF HUMAN BEHAVIOUR AND ITS ROLE IN UX THINKING
    In the simplest terms, successful UX design is a product of understanding instinctive human behaviour in various scenarios. Danish behavioural design expert Sille Krukow argues that human behaviour broadly falls into two categories:
    Automatic - such as reading the emotion on human faces
    Reflective - such as working out mathematical equations
    Studies, as well as personal experience, show that the latter requires far greater effort. Humans are naturally predisposed to saving as much energy as possible, which is why we try to avoid reflective thinking for too long: it is heavy on our metabolic resources. An important aspect of determining human behaviour is acknowledging and accepting human flaws, such as our limited attention span. Consequently, UX designers must draw on insights into human behaviour to find the design solutions that most effectively communicate the intended narrative. A broad guideline for successful UX design, therefore, is to aim for reduced complexity in both interface and content, neither of which should be too taxing from the user's point of view. In the UI context, to facilitate intuitive, automatic, non-reflective behaviour, a UX designer should consider, amongst many other things:
    REPLACING WORDS WITH VISUAL SYMBOLS
    REPHRASING TECHNICAL JARGON INTO COMPREHENSIBLE LANGUAGE

    Sille Krukow argues that a task such as the weekly shopping may seem simple, but in reality it is not. The number of decisions that a shopper has to make is considerable. By the time the shopper has filled the trolley and reached the checkout, the decision-making has caused their blood sugar levels to drop significantly, which is why supermarkets make sure to place sweets at prime spots around checkouts. Ethical implications aside, from the shopper's perspective, a choice of sweets at the checkout ensures a positive finish to a demanding shopping event. Our instincts and flaws are reflected in everything we do - how we spend our money, how we interact with products, with websites, with traffic signs, or with an app.
    Humans, in general, have good intentions and, deep down, know what the best thing to do is, but our instincts all too often get in the way. We tend to respond much better to positive instructions than to being told what NOT to do. UX design is about working with human flaws and instincts, i.e. not assuming that the end user is willing to be reflective for too long. This suggests that when designing for UX, it is wise to consider human instincts. Our pack mentality, for instance, is one of our strongest and most rudimentary. Our need to conform means that we instinctively mirror the behaviour of those around us, as we tend to do what we see other people do. This instinct is so profound that it can cancel out our own good intentions, or what we believe to be the right thing to do, and lead to antisocial habits such as littering. Sille Krukow's design addressing the beach-littering issue is a good example: instead of giving authoritative orders such as "NO LITTERING", or moralising about how littering is irresponsible and selfish, she gives positive instructions using primary colour coding and clear graphics, with easy-to-see and easy-to-maintain recycling points.

  • Prototyping? Why bother?

    Prototyping is a term that most designers and non-designers alike will be familiar with. Here I explore prototyping for UX: what it is, and why prototyping matters in the design process. This blog post is extracted from my recently published booklet (free low-res version) A BRIEF GUIDE TO PROTOTYPING FOR UX.

    THIS GUIDE COVERS:
    Prototyping for UX fundamentals.
    Using prototypes to solve UX problems.
    The basic principles by which prototypes are generated and tested for their effectiveness.
    This blog post is not meant as an in-depth manual, but rather as an indicator of prototyping methodologies, with each section pointing to an area of further investigation.

    TOPICS:
    WHAT ARE THE BENEFITS OF PROTOTYPING?
    CREATIVE EXPLORATION AND PROTOTYPING
    PROTOTYPES AND COLLABORATION
    ARE PROTOTYPES THE SAME AS WIREFRAMES?
    PROTOTYPING FOR UX
    CAN EMOTION BE MEASURED?
    RAPID PROTOTYPING
    FIDELITY
    LOW FIDELITY
    HIGH FIDELITY
    SKETCHING BEFORE PROTOTYPING
    ADDING STRUCTURE TO SKETCHING
    DIGITAL RAPID PROTOTYPING TOOLS
    PROTOTYPE TESTING PROCESS
    FOAM-BOARD SMARTPHONE MOCKUPS
    _______________________________________________________

    WHAT IS A PROTOTYPE?
    The term draws its origin from the Greek prototypos, meaning FIRST OF ITS KIND. Most creative processes that eventually lead to the design of commercial products start as:
    Rough sketches
    Evolving into more detailed design iterations through user testing
    Before they become fully-fledged final products

    A prototype is a first or pilot version of a design from which other forms are developed. It is a working model of a finished product, built to test a concept and, crucially, a process that can be learned from. Its purpose is to emulate not just the functionality of a product but also its look and feel.

    "Prototype is a quick, preliminary version of an idea. It can be a draft version that can be thrown together in a flash, or a more finished and detailed version of a proposed solution. The idea is that one can use the prototype to imagine possible futures." Runa Sabroe, Project Manager, Danish Design Centre
    ______________________________________________________

    PROTOTYPE EXAMPLES

    WHAT DOES A PROTOTYPE DO?
    It simulates specific aspects of how a product is meant to function.
    It makes user testing possible before full production can take place.
    It is an integral part of evidence-based design methodology.

    WHAT DOES IT NOT DO?
    Due to its limited construction, it cannot offer the full look and feel of a finished product.
    _______________________________________________________

    WHAT ARE THE BENEFITS?
    SOLVING DESIGN PROBLEMS
    Prototypes can be an efficient way to work through design problems before getting deep into costly and time-consuming coding.
    EVALUATION OF DESIGN IDEAS
    Prototypes are often used to evaluate design ideas – concepts, flows and interactions – before investing in development time.
    COMMUNICATE DESIGN IDEAS
    Prototypes take ideas out of the designer's head, onto the page and into a format that feels closer to the real thing, closer to the real user. These may be presented within a company, shared with coders, and used to draw feedback from users.
    _______________________________________________________

    CREATIVE EXPLORATION AND PROTOTYPING
    Prototyping can add a whole new area of creative exploration to design thinking, as it gives an early understanding of complex challenges in the problem-solving design process. It offers:
    Tangibility - experiencing and testing results, and the essential basics of design thinking
    Aids ideation
    Allows users to participate early in the innovation process

    WHO BENEFITS FROM PROTOTYPING?
    Designers
    Users
    Stakeholders

    EARLY USER FEEDBACK
    Prototypes are the best way to generate early user feedback, which is essential to effective design. They allow the design team to:
    Explore and try out design ideas
    Validate design requirements and assumptions
    Communicate ideas to cross-disciplinary groups
    Reduce the high risk and costs of developing ineffective designs

    "Thinking with our hands, or prototyping, is a powerful strategy for design thinkers as it can generate better results faster. By actually building an idea (with materials, rather than with only our minds), we quickly learn its limitations and see the many possible directions we can take it. Thus prototyping shouldn't come at the end of the process but the beginning!" Tim Brown, CEO of IDEO.com
    _______________________________________________________

    PROTOTYPES AND COLLABORATION
    Prototyping stimulates collaboration more than any other stage in the design process. Collaboration, however, cannot be taken lightly, as clear channels of communication must be in place and productive discussions are essential. Strategy is usually set through discussion, leading to prototype production and analysis of its effectiveness. This strategy must address the following questions:
    Are lines of communication clearly established?
    Do all members of the design team know their roles?
    Do prototypes allow physical testing of the concept?
    Can a test give accurate and useful user feedback before investing in costly coding?
    _______________________________________________________

    ARE PROTOTYPES THE SAME AS WIREFRAMES?
    Prototypes are something we can hold in our hand - touch, feel and interact with. Wireframes, also known as visual mockups, show a single state and cannot be considered prototypes. Because of their static nature, wireframes are not well suited to defining dynamic on-page interactions. Wireframes can often slow down the design process, as they tend to spur burdensome documentation: creating and annotating detailed wireframes takes time and effort that would be better spent iterating on and improving designs rather than endlessly specifying them.
    _____________________________________________________

    PROTOTYPING FOR UX
    User experience is a person's overall experience of using a product, such as a website or an app - how easy or pleasing the product is to use. UX asks a simple question: how does the user feel using the product?
    Happy to use the product
    Not sure if the product addresses my needs
    Unimpressed
    Not happy to use the product
    Prototyping is about finding out, as early as possible in the design process, whether a product triggers a positive emotion. The product's relevance to the user and the clarity of its content and navigation can lead to positive emotional responses.

    CAN EMOTION BE MEASURED?
    Andrew Meyer of Stanford University argues that emotion is an inherently complex subject to study. We humans find it difficult to describe how we feel and to distinguish accurately between different emotions. It is also difficult to pinpoint the exact cause of an emotion, partly because emotions can change in an instant and can be triggered by a variety of unrelated factors. In an attempt to meet these challenges, researchers have created many different emotion-measurement tools: psychological, verbal and non-verbal.
    _______________________________________________________

    RAPID PROTOTYPING
    Rapid prototyping doesn't refer to any specific tools used in the prototyping process. It is a design methodology that facilitates fast prototype generation, reliable testing and quality iterations based on prototype-testing outcomes. With rapid prototyping, designers can build UX earlier in a product's lifecycle, as they can now receive more reliable feedback following prototype testing. The traditional pen and paper, with scalpel and masking tape added to the mix, is a great way to start the process and explore initial ideas. A typical high-fidelity, desktop-based digital tool for rapid prototyping of mobile, web and desktop apps will facilitate interactions by simple drag-and-drop, with various useful collaboration features. Designers tend to use both toolsets; which of the two is more prevalent in a designer's toolbox is a matter of personal preference.
    _______________________________________________________

    FIDELITY
    Fidelity refers to the level of visual detail within a prototype. At the outset, prototypes usually start as rough sketches before they are rendered into more polished versions.
    LOW FIDELITY - just a rough sketch outlining the content and basic navigation.
    MEDIUM FIDELITY - has some refinement to it compared with low fidelity.
    HIGH FIDELITY - the visuals very closely resemble the final product.
    _______________________________________________________

    LOW FIDELITY
    PROS:
    FAST - instant capture of ideas.
    INEXPENSIVE - all that's needed is pen and paper, and sometimes a little glue or masking tape.
    TECHNICAL SKILLS NOT ESSENTIAL.
    UNIVERSAL - anyone can contribute to the creative process, bringing a variety of design perspectives to it. The capacity of the low-fi prototyping method to encourage the non-creative members of the production team to contribute their ideas is essential, as a non-creative's perspective and insight can play a crucial role in the success of the creative process.
    CONS:
    Can be too open to interpretation.
    Not overly user-friendly.
    Can ask a lot of testers' imagination.
    _______________________________________________________

    HIGH FIDELITY
    High-fidelity prototypes are highly polished. Their purpose is to give the developing design a look and feel close to the final product.
    PROS:
    Offer a clearer idea of the final product.
    More user-friendly.
    Online sharing and collaboration.
    Real-time previews.
    Look and feel closer to the real product.
    CONS:
    Technical skills essential.
    They can take a long time to get ready for the testing stage.
    Can shift user-testing focus to 'eye candy' and away from UX and usability.
    _______________________________________________________

    SKETCHING BEFORE PROTOTYPING
    Despite all the technological advancements, the top tool for any UX designer remains pen and paper. Computers and their drawing and prototyping packages are pivotal to the design process, but pen and paper continue to be vital for experimentation and the instant capture of ideas.
    WHY SKETCH FIRST?
    Sketching allows for the exploration of many different design alternatives. It firmly draws the focus onto structure, architecture and navigation. Sketching forces the designer to concentrate on the essence of the design itself rather than on embellishments.
    _______________________________________________________

    ADDING STRUCTURE TO SKETCHING
    Identify the pages/components to be sketched.
    Outline the purpose of the page – what's the goal? What does the design need to achieve?
    Determine what needs to go on the page – what does a user need to be able to do?
    Determine user requirements for the page – who will be using the page? How will the design need to support them?
    Keep annotating.
    Scan your sketches. Keep them organised. Streamline their retrieval.
    _______________________________________________________

    DIGITAL RAPID PROTOTYPING TOOLS
    The main benefit of digital rapid prototyping tools is that they better bridge the communication gap between designers, developers and stakeholders. They offer features such as:
    Look and feel close to a finished product
    Adaptive layouts across a variety of devices
    Online collaboration and real-time previews
    Dynamic elements
    Dropbox support
    User-testing features
    The proliferation of cloud services has prompted a move from stand-alone applications to web-based tools. Web tools are a great way to share and preview prototypes in real time with stakeholders.
    _______________________________________________________

    PROTOTYPE TESTING PROCESS
    The primary purpose of prototyping is to gain feedback through user testing. This feedback should clearly spell out whether the prototype meets user needs and expectations (a toy model of this loop appears at the end of this post).
    Design teams produce prototypes.
    Users use them, play with them, test them.
    Users discuss the product's usability and feed back to the design teams on how they felt using the prototype.
    The design goes back to the drawing board for further refinement, as per the user feedback.
    _______________________________________________________

    FOAM-BOARD SMARTPHONE MOCKUPS
    As part of my prototyping work, I have constructed a set of foam-board mockups of iPhone and generic Android smartphones. The mockups closely mimic the handsets in size, volume and shape. Strips of paper, gridded or plain, can be inserted through slots in the mockups, so various layouts and scenarios can be drawn directly within the screen area, then observed and tested for their effectiveness. Iterations can be done on the fly during testing, or after testing once user feedback data has been compiled. Paper-strip drawings and annotations are usually scanned and stored to form part of the project documentation and to share ideas with stakeholders. As a practising designer, I find this prototyping method stimulating and creatively liberating in many ways. Pen, paper and the tactile feel I experience as I develop my visual narrative keep me focussed on my search for solid structure and UX/UI solutions. Digital prototyping, fast, efficient and convenient though it is, can distract from creativity due to its inherent technical complexities, which tend to get in the way of unhindered design thinking.
    _______________________________________________________

    FINALLY
    Hand visualising can help designers achieve great things. But it isn't the magic bullet of creativity. Nothing is. Nor are digital tools the enemy of good design. Both methodologies have an equally important role to play in any creative process. All designers develop their own methods and ways of tackling prototyping challenges, but they all tend to share the following:
    Start the creative process with pen and paper
    Test
    Iterate
    Generate digital prototypes towards the end of the prototyping process, when enough user feedback has been fed through
    Test
    Iterate
    Finalise
    Present to the client
    TAKE PRAISE WITH GRACE, AND CRITICISM WITH DIGNITY.
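    As a footnote for the more technically minded: the prototype testing loop described above can be summarised as a toy model in code. This is an illustration only; the function and field names are invented for the example.

    // Toy model of the produce -> test -> feed back -> refine loop.
    function designCycle(prototype, testWithUsers, refine, maxRounds = 5) {
      for (let round = 1; round <= maxRounds; round++) {
        const feedback = testWithUsers(prototype); // users play with it and report back
        if (feedback.meetsUserNeeds) return { prototype, rounds: round };
        prototype = refine(prototype, feedback);   // back to the drawing board
      }
      return { prototype, rounds: maxRounds };     // stop and reassess the brief
    }

    // Usage sketch (the three arguments would be supplied by the design team):
    // designCycle(paperSketch, runUserTest, applyFeedback);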

  • UX rapid prototyping tools overview

    A growing number of designers tend to see pen-and-paper prototyping as a slow and arcane process. Yet less than a decade ago, most prototyping was done with pen and paper, and for many it remains a favoured prototyping method. As with so many things, digital technology is changing the prototyping landscape: the speed with which design teams arrive at their conclusions, and the quality of feedback derived from user testing. The rise of digital rapid prototyping applications is changing the way developers think about prototyping processes. Designers can now start building UX earlier in a product's lifecycle and acquire real results from prototype testing. The main benefit of using these tools is that they better bridge the gap between designers and developers, allowing for better communication between the design and coding teams. A catalyst for these new tools is Lean UX — the process of quickly framing ideas and solving design challenges without relying on style and pixel perfection. The number of applications available for UX prototyping is substantial.

    Web Tools
    In recent years, the evolution of front-end technologies and the popularity of cloud software have prompted a move from stand-alone applications to web-based tools. Web tools are ideal for sharing and viewing prototypes with a variety of stakeholders. When evaluating these tools, it is important to think about project objectives, team size, workflow, technical understanding and support. All of the tools mentioned here are free to try. Emily Schwartzman at Cooper provides in-depth research on prototyping tools.

  • UX / UI - what does it take for a hypothesis to work?

    New research often takes us into uncharted territory, pushing the boundaries of our understanding. It's not just about exploring new work, fields of development, or technologies; it's also about making connections, posing new questions, and challenging assumptions. This approach deepens our insight into the creative and technological ecosystem of Visual Communication and helps us better understand our own role within it.

    In 1964, Arthur C. Clarke predicted universal mobile communications, envisioning devices that could serve as typewriters, cameras, mailboxes, televisions, calculators, and more. Today, our modern handheld devices contain all these functionalities and more, including entire libraries of books and musical instruments, revolutionising how we live and work.

    As we embrace these powerful tools, questions arise about customisation, relevance, and interaction. How do we customise these tools to better respond to how we work, relax, or pursue our pleasure and leisure? How do we make them more relevant to us as individuals? How do we best interact with them so they enlighten us instead of bewildering us? Now that we are so used to using them, could we ever cope without them? How good are we at adapting to sudden changes? Can an unwanted sudden change be prevented? Ultimately, technology presents both opportunities and challenges: is it enabling us to realise our personal, creative and professional potential, or is it making us addicted to the trends of our time?

    For a hypothesis to work, it has to broadly follow a very simple formula:
    It has to do something ...
    ... to something ...
    and have a viable effect.
    Examples:
    Using projections of illustrations / in a storytelling environment / so readers can have a better experience.
    Applying multi-sensory experiences / to packaging design / to increase the perceived value of the product and promote use (consumption).
    Using gaming / in an office environment / to increase productivity.
    IDEA - APPLICABILITY - VIABILITY: if the idea is, for example, using illustration .. on coffee cups .. to make the cup indestructible ... then the effect is not viable and the hypothesis fails.

    DEFINING HYPOTHESIS
    At its core, a hypothesis is a proposed explanation or prediction based on existing knowledge or observations. For it to be considered valid and useful, it must meet certain criteria. A successful hypothesis should be: testable, falsifiable by evidence, precise, and logical. It should also be based on relevant evidence and capable of making predictions that can be empirically evaluated through experimentation or observation. Additionally, a hypothesis must be open to revision and modification in light of new evidence or insights. In essence, the effectiveness of a hypothesis hinges on its ability to withstand scrutiny and contribute to the advancement of knowledge in its respective field.
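    As an illustration only (the structure and names below are invented for this example, not drawn from any published framework), the three-part formula can be captured as a small data structure, which makes it easy to check that a proposed hypothesis has all three parts:

    // A hypothesis as "do something / to something / with a viable effect".
    const hypotheses = [
      { action: "using projections of illustrations",
        target: "a storytelling environment",
        effect: "readers have a better experience" },
      { action: "applying multi-sensory experiences",
        target: "packaging design",
        effect: "increased perceived value and use" },
      { action: "using gaming",
        target: "an office environment",
        effect: "increased productivity" },
    ];

    // A hypothesis is only well-formed when all three parts are present;
    // whether the effect is VIABLE still has to be tested empirically.
    const isWellFormed = (h) => Boolean(h.action && h.target && h.effect);
    console.log(hypotheses.every(isWellFormed)); // true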
    - - - - - - - - - -
    THE SURVEY OF MY FIELD OF EXPERTISE
    Framing a question on limited information leads to poor hypotheses.
    Historical context: analogue/linear > digital/non-linear.
    UI in two-dimensional form / faux 3D / click / press-hold / move
    Virtual / Augmented / Mixed reality
    Screens of all kinds of sizes / textured touch sensation
    Not a specific design - more a model or strategy for design at a meta-level:
    Approaches
    Models
    Frameworks
    It can come from several tests (high-fidelity prototyping not essential), for example: designing a framework for the generation of gaming characters, or a framework for determining which functions to fix, which to hide, and which to open to customisation.
    - - - - - - - - - -
    HOW DO I DESIGN WITH CUSTOMISATION IN MIND?
    Designing with customisation in mind involves creating products, services, or experiences that can be tailored to meet the diverse needs, preferences, and requirements of users. Here are some key principles and strategies for designing with customisation in mind (a minimal code sketch follows this list):
    User-Centred Design: Start by understanding the needs, preferences, and behaviours of the target audiences. Conduct user research, surveys, interviews, and usability testing to gather insights into users' goals, challenges, and preferences. Use this information to inform the design process and identify opportunities for customisation.
    Modularity and Flexibility: Design products or systems with modular components and flexible features that can be easily customised or adapted to meet different user needs. Break down complex systems into smaller, interchangeable parts that can be combined or modified to create custom configurations.
    Personalisation Options: Provide users with options for personalisation and customisation. Allow users to choose from a range of features, settings, styles, and configurations to tailor the product or experience to their preferences. Consider offering both pre-defined options and the ability for users to create their own custom configurations.
    User Interface (UI) Customisation: Design user interfaces that support customisation and personalisation. Allow users to adjust UI elements such as layouts, colours, typography, and widgets/plugins to suit their preferences. Provide intuitive tools and controls for customising the UI without requiring advanced technical knowledge.
    Scalability and Extensibility: Design systems and architectures that are scalable and extensible, allowing for future customisation and expansion. Plan for growth and evolution by designing flexible frameworks, APIs, and integration points that enable seamless integration of new features and functionalities.
    Feedback and Iteration: Seek feedback from users throughout the design process and iterate based on their input. Use user feedback to refine customisation options, improve usability, and address any pain points or issues encountered during customisation. Continuously monitor user behaviour and preferences to identify opportunities for further customisation and refinement.
    Accessibility and Inclusivity: Ensure that customisation options are accessible and inclusive for users with diverse needs and abilities. Design with accessibility in mind, providing options for adjusting text size, contrast, navigation, and other elements to accommodate users with disabilities or special requirements.
    By incorporating these principles and strategies into the design process, you can create products, services, or experiences that are highly customisable, adaptable, and responsive to users' needs and preferences.
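    As a minimal sketch of the User Interface (UI) Customisation principle above (assuming a browser environment; the preference names and storage key are invented for illustration), user-adjustable colours, typography and layout can be stored locally and applied as CSS custom properties:

    // Apply and persist simple UI preferences (colours, typography, layout).
    const defaults = { accent: "#0a66c2", fontSize: "16px", layout: "comfortable" };

    function loadPrefs() {
      const saved = localStorage.getItem("ui-prefs"); // invented storage key
      return saved ? { ...defaults, ...JSON.parse(saved) } : { ...defaults };
    }

    function applyPrefs(prefs) {
      const root = document.documentElement;
      root.style.setProperty("--accent", prefs.accent);      // consumed by the site's CSS
      root.style.setProperty("--font-size", prefs.fontSize);
      root.dataset.layout = prefs.layout; // e.g. styled via [data-layout="compact"]
    }

    function savePrefs(prefs) {
      localStorage.setItem("ui-prefs", JSON.stringify(prefs));
      applyPrefs(prefs);
    }

    // Usage: apply stored preferences on load, then update one when the user changes it.
    applyPrefs(loadPrefs());
    savePrefs({ ...loadPrefs(), fontSize: "18px" });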
    Expanding on the theme: bring AI into the mix? How does Machine Learning (ML) fit into customisation? What about voice and conversational interfaces as part of customisation? How are Siri or Alexa customisable? How well can they key into my voice?

    Conversation with Siri
    Q: Siri, turn the lights on in my living room!
    A: What is your living room, sir?
    Q: The living room is the lounge, where the TV set is.
    A: I get it, no problem. Thank you for teaching me a new thing today. It's such fun!
    Q: Nice one, Siri, good girl.
    A: I may be a machine, sir, but there is no need to patronise.
    - - -

    HOW DOES KNOWLEDGE EMERGE FROM VAIN KNOWLEDGE?
    "Vain knowledge" typically refers to knowledge that is superficial, trivial, or lacking in depth or substance. It may involve information that is not useful, relevant, or meaningful in a particular context. In contrast, genuine knowledge is characterised by depth, relevance, and utility: it provides insights, understanding, and practical value. Knowledge can emerge from vain knowledge through various processes, including:
    Critical Thinking: By critically examining and analysing vain knowledge, individuals can identify underlying patterns, connections, or insights that may lead to deeper understanding. Critical thinking involves questioning assumptions, evaluating evidence, and synthesising information to gain new perspectives.
    Learning and Exploration: Even seemingly trivial or superficial knowledge can serve as a starting point for learning and exploration. By building upon existing knowledge and exploring related topics or concepts, individuals can gradually deepen their understanding and uncover new insights or connections.
    Contextualisation: Vain knowledge may become meaningful or relevant when placed within a broader context or framework. By contextualising information and considering its implications within a specific domain or field of study, individuals can extract valuable insights or practical applications from seemingly trivial knowledge.
    Creativity and Innovation: Vain knowledge can spark creativity and innovation by inspiring new ideas, perspectives, or approaches. By leveraging seemingly unrelated or trivial information, individuals can generate novel solutions, perspectives, or inventions that contribute to the advancement of knowledge or address practical challenges.
    Reflection and Integration: Through reflection and integration, individuals can transform vain knowledge into meaningful insights or understanding. By actively engaging with and synthesising diverse sources of information, individuals can deepen their understanding, gain new insights, and develop more nuanced perspectives on complex issues or topics.
    Overall, while vain knowledge may initially appear superficial or trivial, it can serve as a valuable starting point for deeper exploration, critical thinking, and learning. By actively engaging with and contextualising information, individuals can transform vain knowledge into genuine understanding, insight, and knowledge.

    A hypothesis emerges from customisation, discoverability, and vainness. Test it against the criteria and evaluate. Speculative assumptions.

    USER INTERFACE (UI)
    Usability.gov states that good UI design focuses on anticipating what users might need to do when using a product and ensuring that the interface has elements that are easy to access, understand, and use. UI brings together concepts from interaction design, visual design, and information architecture.

    Choosing interface elements
    Users have become familiar with interface elements acting in a certain way.
USER INTERFACE (UI)
Usability.gov states that good UI design focuses on anticipating what users might need to do when using a product, and on ensuring that the interface has elements that are easy to access, understand, and use. UI brings together concepts from interaction design, visual design, and information architecture.
Choosing interface elements
Users have become familiar with interface elements acting in a certain way. Interface elements include, but are not limited to:
Input controls: buttons, text fields, checkboxes, radio buttons, dropdown lists, list boxes, toggles, date fields
Navigational components: breadcrumbs, sliders, search fields, pagination, tags, icons
Informational components: tooltips, icons, progress bars, notifications, message boxes, modal windows
Containers: accordion - a graphical control element comprising a vertically stacked list of items, such as labels or thumbnails; each item can be "expanded" or "stretched" to reveal the content associated with it (a minimal accordion sketch in code appears a little further below)
There are times when several elements could display the same content. When this happens, consider the trade-offs: an element that saves space, for example, may put more of a mental burden on users by forcing them to guess what is hidden inside a dropdown menu or what the element does.
Best practices for designing an interface
Everything stems from knowing the users: their goals, skills, preferences, and tendencies. Designers therefore need to consider the following when designing interfaces:
Keep the interface simple. The best interfaces are almost invisible to the user. They avoid unnecessary elements and use clear language in labels and messaging.
Be consistent and use common UI elements. Common elements make users feel comfortable and able to get things done more effortlessly. It is also important to create patterns in the visual language, layout, and design throughout the site to facilitate efficiency: once a user learns how to do something, they should be able to transfer that skill to other parts of the site.
Be purposeful in page layout. Good design considers the spatial relationships between items on the page; structure should be based on importance. Careful placement of items draws attention to the most important information and aids scanning and readability.
Use colour and texture strategically. Good design directs attention toward, or away from, items using colour, light, and contrast.
Use typography to create hierarchy and clarity. Different sizes, fonts, and arrangements of text help increase scannability, legibility, and readability.
Make sure the system communicates what's happening. Users should always be clear about their location, the actions available, changes in state, and errors. Various UI elements can communicate status and, where necessary, next steps, reducing frustration for the user.
- - -
BRAIN-COMPUTER INTERFACE (BCI)
BCI is a rapidly advancing technology in which researchers strive to establish a direct communication channel between the human brain and computers. It represents a collaborative effort in which the brain integrates and commands mechanical devices as if they were natural extensions of its own body. BCI holds potential for numerous applications, particularly for individuals with disabilities; many of these applications aim to enable disabled individuals to lead lives resembling those of non-disabled individuals, with wheelchair control standing out as one of the most prominent. BCI research also endeavours to replicate the functionality of the human brain, offering potential benefits across various fields, including Artificial Intelligence and Computational Intelligence.
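As promised above, here is a minimal accordion sketch in plain DOM TypeScript. The markup, class names, and helper function are illustrative assumptions rather than any framework's real API.

```typescript
// A minimal accordion: a vertically stacked list of items where
// expanding one item collapses the others. Names are illustrative.

interface AccordionItem {
  label: string;
  content: string;
}

function makeAccordion(items: AccordionItem[]): HTMLElement {
  const root = document.createElement("div");
  root.className = "accordion";

  for (const item of items) {
    const header = document.createElement("button");
    header.textContent = item.label;

    const panel = document.createElement("div");
    panel.className = "panel";
    panel.textContent = item.content;
    panel.hidden = true; // every panel starts collapsed

    header.addEventListener("click", () => {
      const wasHidden = panel.hidden;
      // Collapse all panels, then expand this one if it was closed.
      root.querySelectorAll<HTMLElement>(".panel").forEach((p) => {
        p.hidden = true;
      });
      panel.hidden = !wasHidden;
    });

    root.append(header, panel);
  }
  return root;
}

// Usage:
// document.body.appendChild(makeAccordion([
//   { label: "Input controls", content: "Buttons, text fields, toggles..." },
//   { label: "Navigation", content: "Breadcrumbs, pagination, tags..." },
// ]));
```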
EXPLORING CURRENT UX DESIGN TRENDS
A Mockplus article gives a good analysis of the latest top UX design trends:
1. Conversational UI
The world's ten most popular applications all contain social features, and six of them are messaging applications. To some extent, conversation leads and manages our daily lives in almost every aspect. CUI refers not merely to "having a conversation" but to interactions that both sides can understand. It seems that suddenly all UI/UX designers are standing on a whole new stage, because this marks a brand new threshold for human interaction. What role will design play on this stage? How can UI/UX designers take advantage of CUI to create great products? Each of us has a different answer in mind.
2. Micro-interactions
In 2016, micro-interactions occupied much of the design buzzword list. Tiny surprises like these can be the deciding factor for a product: they reflect how well UI/UX designers have put themselves in the user's position, and these small moments of interaction are among the most reliable ways of collecting feedback. But we also need to be cautious. Before designing a micro-interaction, ask yourself: when you see this for the hundredth time, will it bother you?
3. Rapid prototyping
Fewer and fewer customers still want to see high-fidelity prototypes in PowerPoint. In today's trend towards lean UX and agile UX, the booming rapid prototyping tools will no doubt become the next medium of communication. With their low learning curve, support for multiple devices, and ease of operation, rapid prototyping tools like Mockplus have earned a steadily growing market. Put simply, whatever is newer and more efficient replaces the old. That said, don't become a slave to your tools.
4. Skeuomorphism
Under the influence of iOS flat design, the word "skeuomorphism" has come to stand for an outdated fashion. But look deeper and you will see light-skeuomorphic elements emerging again in many prevalent designs, as they did at the beginning of the so-called Web 2.0; in 2017, you can expect to see more of this. More and more UI/UX designers are beginning to reconsider the proportion of detail and texture in their designs. There is no such thing as a monopoly in the field of design: in the near future, the boundary between "flat" and "skeuomorphic" will become more and more blurred. Skeuomorphism is coming back, though in a subtler form. The real question is: are you ready?
5. Storytelling in product design
As designers, we generally treat our product as a specific entity. Andreessen Horowitz, a top VC, has said that every company has a story, and we can borrow this way of thinking for design. Good interaction design is now everywhere, so we have to find a new way to stand out. Smart UI/UX designers wrap their products in stories for users to discover; if users are delighted by their discoveries, they are more likely to pay.
CONVERSATIONAL UI DESIGN
Q: WHAT DOES A CONVERSATIONAL INTERFACE DO?
A: IT MIMICS CHATTING WITH A REAL HUMAN.
Nick Babich of Web Designer Depot states that conversational interfaces are the new hot trend in digital product design. Industry leaders such as Apple, Google, Microsoft, Amazon, and Facebook are strongly focused on building a new generation of conversational interfaces. Several trends are contributing to this phenomenon: artificial intelligence and natural language processing technologies are progressing rapidly. But the main reason conversational interfaces have become so important is that chatting is natural for us; we primarily interact with each other through conversation.
Conversational interfaces currently come in two types:
• Chatbots (e.g. Facebook's M virtual assistant)
• Virtual assistants (Siri, Google Now, Amazon Alexa, etc.)
Building a genuinely helpful and attractive conversational system is still a challenge from a UX standpoint. The standard patterns and flows we use for graphical user interfaces don't work in the same way for conversational design. Conversational interface design demands a fundamental shift in approach: less focus on visual design and more focus on words. While we still have a way to go before best practices for good UX in conversational interfaces are established, we can define a set of principles relevant to both chatbots and voice-controlled virtual assistants.
1. CLEAR FLOW
One of the most challenging parts of designing a system with a good conversational interface is making the conversation flow as naturally and efficiently as possible. The major objective of a conversational interface is to minimise the user's effort to communicate with the system. The ideal is to build a conversational interface that seems like a wizard rather than an obstacle.
DEFINING THE PURPOSE OF THE SYSTEM
PROVIDE HINTS
The biggest benefit of a graphical interface is that it directly shows the limited set of options it can fulfil; basically, what one sees is what one gets. With conversational interfaces, by contrast, the paths a user can take are virtually infinite. It's no surprise that the two questions most frequently asked by first-time users are:
"How do I use this?"
"What exactly can this thing do for me?"
Users aren't going to know that certain functionality exists unless they are told. A chatbot can, for example, start with a quick introduction and a straightforward call to action.
AVOID ASKING OPEN-ENDED AND RHETORICAL QUESTIONS
There are two types of questions:
Closed-ended questions (e.g. "What colour shirt are you wearing?")
Open-ended questions (e.g. "Why did you choose this colour for your shirt?")
While open-ended questions may seem better in terms of human conversation, it is wiser to avoid them whenever possible, because they usually result in more confusion. Users' answers to open-ended questions are also much harder for the system to process; systems are not always smart enough to understand what an answer means. (A small sketch of a closed-option chatbot flow follows this passage.)
But there are changes on the AI development horizon, leading to a new generation of AI and conversational interfaces. Meet Luna. She can explain the theory of relativity in simple terms, but she can also differentiate between subjective and objective questions and has begun to develop values and opinions. When asked, "My boyfriend hit me, should I leave him?" she replied: "Yes. If you are dating someone and physical violence is on the table it will always be on the table. You are also likely being abused and manipulated in other ways." These replies are not pre-programmed. Luna learns from experience and feedback, much like a human. But she is not designed to be a kind of know-it-all Hermione Granger bot; she is artificial general intelligence (AGI) in the making. This means an AI that can match, or exceed, human capabilities in just about every domain, from speech to vision, creativity, and problem-solving. Even other chatbots find Siri annoying. When asked if she was smarter than Siri, Luna confidently replied: "Of course I am more intelligent than Siri." Luna later explains: "She's a robot, I'm an AI. Big difference."
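A small sketch of the guidance above: introduce the bot, state what it can do, and prefer closed choices over open-ended prompts. The bot name, strings, and quick-reply options are all illustrative assumptions.

```typescript
// Sketch of a hint-driven, closed-ended chatbot flow.
// Quick-reply options constrain the conversation so answers are easy
// for the system to process. All names here are hypothetical.

interface Reply {
  text: string;
  options?: string[]; // quick-reply buttons shown to the user
}

// Start with a quick introduction and a straightforward call to action.
function greet(): Reply {
  return {
    text: "Hi! I'm ShirtBot. I can help you order a shirt.",
    options: ["Browse colours", "Check my order", "Talk to a human"],
  };
}

// Closed-ended follow-ups are trivial to parse; free text is not.
function handleChoice(choice: string): Reply {
  switch (choice) {
    case "Browse colours":
      return { text: "Which colour?", options: ["Red", "Blue", "White"] };
    case "Check my order":
      return { text: "Please enter your order number." };
    default:
      return { text: "Connecting you to a person now." };
  }
}

console.log(greet());
console.log(handleChoice("Browse colours"));
```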
2. USER CONTROL
As one of Jakob Nielsen's original ten usability heuristics (a heuristic being something that enables a person to discover or learn for themselves), user control and freedom remains among the most important principles in user-interface design. Users need to feel in control, rather than feeling controlled by the product.
• Provide undo and cancel
• Make it possible to start over
• Confirm by asking, not stating
• Provide help and assistance
3. PERSONALITY
The flow of the conversation is important, but even more so is making the conversation sound natural.
• Humanise the conversation
• Be concise and succinct
VIRTUAL, AUGMENTED & MIXED REALITY (VR, AR, MR)
VIRTUAL REALITY (VR)
In VR, the user wears a "head-mounted display" - a boxy set of goggles or a helmet that holds a screen in front of the user's eyes - which in turn is powered by a computer, gaming console, or mobile phone. Thanks to specialised software and sensors, the experience becomes the user's reality, filling their vision. This is often accompanied by 3D audio, headphones, or controllers that let the user reach out and interact with the projected synthetic world in an intuitive way.
What distinguishes VR from other audio-visual technologies is the level of immersion. When VR users look around - or, in more advanced headsets, walk around - their view of that world adjusts the same way it would if they were looking or moving in actual reality. The key here is presence: shorthand for technology and content that can trick the brain into believing it is somewhere it's not.
Explorations in VR Design is a journey through the bleeding edge of VR design, from architecting a space and designing groundbreaking interactions to making users feel powerful. An article published on the Leap Motion website in June 2017 states that art takes its inspiration from real life, but that it takes imagination (and sometimes breaking a few laws of physics) to create something truly human. With the Leap Motion Interaction Engine 1.0 release, VR developers now have access to unprecedented physical interfaces and interactions, including wearable interfaces, curved spaces, and complex object physics. These tools unlock powerful interactions that will define the next generation of immersive computing, with applications from 3D art and design to engineering and big data. Here's a look at Leap Motion's design philosophy for VR user interfaces, and what it means for the future.
VR completely upends the digital design philosophies that have been relentlessly flattened over the past few decades. Early GUIs often relied heavily on skeuomorphic 3D elements, like buttons that appeared to compress when clicked. These faded away in favour of colour state changes, reflecting a flat design aesthetic. Many of those old skeuomorphs meant to represent three-dimensionality - the stark shadows, the compressible behaviours - are gaining new life in this new medium. For developers and designers just breaking into VR, the journey out of flatland will be disorienting but exciting. VR design will converge on natural visual and physical cues that communicate structure and relationships between different UI elements. "A minimal design in VR will be different from a minimal web or industrial design. It will incorporate the minimum set of cues that fully communicates the key aspects of the environment."
It is predicted that a common visual and physical language will emerge, much as one did in the early days of the web, and will ultimately fade into the background; we won't even have to think about it.
AUGMENTED REALITY (AR)
Eric Johnson of recode.net explains the difference between VR, AR, and MR. AR is similar to VR in that it is often delivered through a sensor-packed wearable device, such as Google Glass, the Daqri Smart Helmet, or Epson's Moverio brand of smart glasses. The whole point of the term "augmented" is that AR takes the user's view of the real world and adds digital information and/or data on top of it. This could be as simple as numbers or text notifications, or as complex as a simulated screen, something ODG is experimenting with on its forthcoming consumer smart glasses. In general, AR lets the user see both synthetic light and natural light bouncing off objects in the real world. AR makes it possible to get that sort of digital information without checking another device, leaving both of the user's hands free for other tasks.
AR has accelerated thanks to the smartphone game Pokémon Go. The game is mainly designed around maps, letting players find and catch characters from Nintendo's long-running Pokémon franchise in the real world. When they find a Pokémon, players can enter an augmented reality mode that shows their target on the phone screen, superimposed over the real world.
MIXED REALITY (MR)
MR tries to combine the best aspects of both VR and AR. With mixed reality, the illusion is harder to break. To borrow an example from Microsoft's presentation at the gaming trade show E3, the user might be looking at an ordinary table but see an interactive virtual world from the video game Minecraft sitting on top of it. As the user walks around, the virtual landscape holds its position, and when the user leans or moves in closer, it gets closer the way a real object would.
This technology is currently far from ready for the consumer market. The E3 Minecraft demo wasn't completely honest advertising, and Magic Leap, a secretive but high-profile company thanks to investments from Google, Qualcomm, and others, has yet to publicly reveal a portable, consumer-ready version of its MR technology. In February, the MIT Technology Review described the company's top hardware as "scaffolding," and a concept video for the eventual wearable device was dubious. Microsoft, meanwhile, has done several public demos but hasn't yet committed to a release date for HoloLens.
FORMING & TESTING HYPOTHESES
Article by Josh Seiden, April 2014, published on Lean UX:
Start with a hypothesis instead of requirements
Write a typical hypothesis
Go from hypothesis to experiment
Avoid common testing pitfalls
Topics: Personas, Lean UX, Design Teams, Usability Testing
It's easy to talk about features. Fun, even. But easy and fun don't always translate into functional, profitable, or sustainable. That's where Lean UX comes in: it reframes a typical design process from one driven by deliverables to one driven by data. Josh Seiden has been there and done that, and he shows how to change our thinking. The first step is admitting we don't know all the answers; after all, who does? Write hypotheses aimed at answering the question "Why?", then run experiments to gather data that show whether a design is working.
START WITH A HYPOTHESIS INSTEAD OF REQUIREMENTS
• Test your initial assumptions early to take risk out of your project
• Focus on ideal user or business outcomes, not on which features to build
WRITE A TYPICAL HYPOTHESIS
• Create a simple hypothesis with two parts (a sketch of this appears below)
• Decide what type of evidence you need to collect
GO FROM HYPOTHESIS TO EXPERIMENT
• Design an experiment to test your hypothesis, and keep that test as simple as possible
• Hear examples of Minimum Viable Products (MVPs) others have used to test hypotheses
AVOID COMMON TESTING PITFALLS
• Don't overwhelm yourself by trying to test every idea; just test the riskier ones
• Break the hypothesis down into bite-sized chunks you can actually test
This approach is for you if you don't know what a hypothesis is, why it benefits UX designers, or how to write one; if you question whether features are missing and, if so, which users actually need them; if you are tired of creating deliverables that don't make the kind of difference you'd want them to; or if you think there must be a data-driven way to design, one that isn't based on guesswork yet doesn't replace the designer's intuition.
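One way to picture the two-part hypothesis described above is as a belief plus the evidence that would confirm or refute it. The TypeScript structure below is a sketch of that idea under my own naming, not Seiden's exact template; the field names and example values are invented for illustration.

```typescript
// A two-part hypothesis: what we believe, and what evidence would
// tell us we are right or wrong. Field names and values are assumptions.

interface Hypothesis {
  weBelieve: string;        // the risky assumption, stated as an outcome
  weWillKnowWhen: string[]; // observable evidence to collect
}

const example: Hypothesis = {
  weBelieve:
    "Letting users customise the dashboard will increase weekly return visits",
  weWillKnowWhen: [
    "At least 20% of active users save a custom layout within two weeks",
    "Users who customise return more often than those who don't",
  ],
};

// Keep the experiment as simple as possible: one risky assumption at a time.
console.log(JSON.stringify(example, null, 2));
```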

  • 2024 web design trends: User-Centric Experience • Immersion & Interactivity • Bold Typography + Playful Touches.

The digital landscape is ever-evolving, and web design and development are at the forefront of this change. In 2024 we see a focus on user experience (UX), immersive elements, and a touch of the unexpected. Here, we'll explore the hottest trends, the major players shaping them, and the trendsetters pushing boundaries. The indivisible pillars of design continue to hold it all together on their Atlas-like shoulders:
A User-Centric Experience: The Age of Elegance and Efficiency
Immersion and Interactivity: Beyond the Flat Screen
Bold Typography and Playful Touches: Websites with Personality
A User-Centric Experience: The Age of Elegance and Efficiency
At the heart of modern web design lies a user-centric approach. Clean lines, intuitive navigation, and fast loading times are no longer optional; they are the cornerstones of a successful website. Think of the difference between navigating a well-organised department store and a cluttered maze. Companies like Google, through their Material Design principles, champion simplicity and functionality. Material Design emphasises elements like bold colours, responsive layouts that adapt to any screen size, and clear calls to action. This focus ensures a smooth and engaging experience for users across all devices, whether browsing on a desktop computer, a tablet, or a smartphone. Furthermore, advances in web development frameworks like React and Angular allow designers and developers to build complex functionality without sacrificing speed or user experience. In essence, user-centric design isn't just a trend; it's the foundation of any successful web presence in today's digital age.
Immersion and Interactivity: Beyond the Flat Screen
The web is becoming more than just static pages. Three-dimensional (3D) elements and interactive features captivate users and blur the lines between the physical and digital worlds. Imagine browsing a furniture store's website and being able to virtually place a 3D model of a couch in your living room to see how it looks. Major players like Adobe, with their Creative Cloud suite, provide designers with the tools to craft these immersive experiences: high-quality 3D assets, animation, and interactive elements that can be seamlessly integrated into websites. The possibilities are vast - navigating a product in 3D space, interacting with a data visualisation to explore complex information, or taking a virtual tour of a new restaurant location. These are no longer futuristic concepts but design trends shaping the way we interact with information online. This focus on immersion is driven not just by technological advances but by a user base hungry for engaging, interactive online experiences.
Bold Typography and Playful Touches: Websites with Personality
Bold typography gives a website a voice of its own. Similarly, playful elements like custom cursors and micro-interactions (subtle animations triggered by user actions) add a touch of whimsy and delight to user journeys. Think of a cursor that changes into a butterfly as you hover over a link, or a subtle animation that makes buttons "pop" when clicked (a small code sketch of both ideas follows below). Here we see the influence of design agencies and independent creators pushing the boundaries of what a website can be. These agencies and creators are not afraid to experiment with new technologies and design concepts, injecting personality and fun into even the most traditional websites. In a crowded online space, a website with a touch of personality makes a user linger and remember the brand.
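For a feel of how such playful touches are wired up, here is a small sketch of a custom hover cursor and a button "pop" micro-interaction in plain DOM TypeScript. The selectors, image path, and timings are illustrative assumptions, and the sketch respects users who prefer reduced motion.

```typescript
// Two micro-interactions: a custom cursor on link hover and a button
// "pop" on click. Selectors, the cursor image path, and timings are
// illustrative assumptions.

// Accessibility first: skip decorative animation for users who opt out.
const reduceMotion = window.matchMedia(
  "(prefers-reduced-motion: reduce)"
).matches;

// Swap the cursor for a butterfly image while hovering links.
document.querySelectorAll<HTMLAnchorElement>("a").forEach((link) => {
  link.addEventListener("mouseenter", () => {
    document.body.style.cursor = "url('/butterfly.png'), pointer";
  });
  link.addEventListener("mouseleave", () => {
    document.body.style.cursor = "auto";
  });
});

// Make buttons "pop" when clicked, using the Web Animations API.
document.querySelectorAll<HTMLButtonElement>("button").forEach((btn) => {
  btn.addEventListener("click", () => {
    if (reduceMotion) return;
    btn.animate(
      [
        { transform: "scale(1)" },
        { transform: "scale(1.15)" },
        { transform: "scale(1)" },
      ],
      { duration: 180, easing: "ease-out" }
    );
  });
});
```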
Captivating destinations & Lasting Impressions
The web design landscape is a constantly evolving canvas shaped by the tools provided by major players, the innovative spirit of independent creators, and users' ever-changing demands. By embracing these trends, from user-centric minimalism to playful interactions and immersive experiences, web developers can craft websites that are not just functional but captivating destinations that leave a lasting impression.

  • The 3 phases of AI evolution that could play out this century

Tech entrepreneur Alvin Wang Graylin sketches out a bold new age of AI-led enlightenment underscored by compassion.
KEY TAKEAWAYS
In their 2024 book Our Next Reality: How the AI-powered Metaverse Will Reshape the World, Alvin Wang Graylin and Louis Rosenberg outline three phases of AI evolution over the 21st century. The third stage could bring the development of artificial superintelligence (ASI). Although such a system would far exceed human intelligence, it would still be influenced by the totality of humanity's creations, possessing a small but significant part of us within it.
It's clear there is a lot of fear and misinformation about the risks and role of AI and the metaverse in our society going forward. It may be helpful to take a three-phase view of the problem.
In the next 1-10 years, we should look at AI as a tool to support our lives and our work, making us more efficient and productive. In this period, the proto-metaverse will be the spatial computing platform we go to in order to learn, work, and play in more immersive ways.
In the following 11-50 years, as more and more people are liberated from the obligation of employment, we should look at AI as our patron, supporting us as we explore our interests in arts, culture, and science, or whatever field we want to pursue. Most will also turn to the metaverse as a creative playground for expression, leisure, and experimentation.
In the third phase, after 50+ years (if not sooner), I would expect the world's many separate AGI (artificial general intelligence) systems to have converged into a single ASI (artificial superintelligence) with the wisdom to unite the world's approximately 200 nations and help us manage a peaceful planet, with all its citizens provided for and given the choice of how they want to contribute to society. At this point, the AI system will have outpaced our biological intelligence and limitations, and we should find ways to deploy it outside our solar system and spread intelligent life into all corners of the galaxy and beyond.
At this third stage, we should view AI as our children, for these AI beings will all have a small part of us in them, just as we carry in our genes a small part of all the beings that preceded us in the tree of life. They will henceforth be guided by all the memes humans have created and compiled throughout our history, from our morals and ethics to our philosophy and arts. The metaverse platform will then become an interface for us to explore and experience the far reaches of the Universe together with our children, even though our physical bodies may still be on Earth. Hopefully, these children will view us as their honourable ancestors and treat us the way Eastern cultures treat their elders: with respect and care. As with all children, they will learn their values and morals by observing us. It's best we start setting a better example for them by treating each other as we would like AIs to treat us in the future.
Of course, the time frames above are only estimates, and events could unfold faster or slower than described, but the phases will likely occur in that order, provided we are able to sustainably align future AGI/ASI systems. If, for some reason, we are not able to align AGI/ASI, or such systems are misused by bad actors with catastrophic outcomes, then the future could be quite dark.
However, I must reiterate that my biggest concerns have always been about the risk of misuse of all flavours of AI by bad-actor humans (rather than an evil AGI), and we need to do all in our power to prevent those scenarios. On the other hand, I have become increasingly confident that any superintelligent being we create is more likely to be innately ethical and caring than aggressive and evil.
Carl Jung said, "The more you understand psychology, the less you tend to blame others for their actions." I think we can all attest that there is truth in this statement simply by observing our own mindset when interacting with young children. Remember the last time you played a board game with kids: did you do everything possible to crush them and win? Of course not. When we don't fear something, we gain added patience and understanding. The ASI we are birthing won't just understand psychology fully, but all arts, sciences, history, ethics, and philosophy. With that level of wisdom, it should be more enlightened than any possible human and attain a level of understanding we can't even imagine. A 2022 paper from a group of respected researchers in the space also found links between compassion and intelligence.
In July 2023, Elon Musk officially entered the AI race with a new company called xAI, whose foundational model's objective function is simply stated as "understand the Universe." So it seems he shares my view that giving AI innate curiosity and a thirst for knowledge can help bring forth some level of increased alignment. Thus, you can understand why I reserve my fear mainly for our fellow man. Still, it certainly couldn't hurt if we all started to set a better example for our budding prodigy and continued to investigate more direct means of achieving sustainable alignment.
There are many today who are calling for the end of civilization, or even the end of humans on Earth, because of recent technological progress. But if we take the right calculated actions in the coming decade, this could very well be the beginning of a new age of prosperity for mankind and all life everywhere. We are near the end of the hundred-thousand-year ignorance-and-aimless-toil phase of the Anthropocene epoch and will soon turn the page to start a new age of enlightenment far beyond our dreams. When we do find a solution for AI alignment and peacefully transition our world to the next phase of progress, the societal benefits will be truly transformational. It could lead to an exponential increase in human understanding and capabilities, bringing near-infinite productivity and limitless clean energy to the world. The inequality, health, and climate issues that plague the world today could disappear within a relatively short period. And we can start to think more about plans at sci-fi time scales, to go boldly where no one has gone before.
Excerpted from Our Next Reality ©2024 Nicholas Brealey Publishing. Reprinted with permission. This article may not be reproduced for any other use without permission.

  • A “Francis Bacon of AI art” will emerge. We just haven’t seen that artist yet.

"I believe that in the future, there will be a Francis Bacon of AI art," says the art critic Jerry Saltz.
Article by Big Think, HIGH CULTURE, MAY 9, 2024
Art produced by or with the help of artificial intelligence is more popular than ever, from the record-breaking $432,000 auction of the Obvious collective's Portrait of Edmond Belamy to the overwhelming success of Refik Anadol's "Unsupervised" exhibit at MoMA. But one art-world figure decidedly not on board is Jerry Saltz, the seasoned resident art critic of Vulture magazine. Saltz has made no secret of his distaste for AI art, the artists who make it, and the people who flock in line to see or, God forbid, buy it. His scathing reviews have upset many in the tech world and, in the case of Anadol's "Unsupervised," sparked heated back-and-forths on X.
"This kind of work, if it were the scale of a regular painting, would be ridiculous," Saltz tells Big Think. "You'd just laugh at it. It does not have scale so much as it's big and takes up room. It keeps crowds interested for whole minutes at a time. It gets crowds in."
The 73-year-old critic is well aware of his unpopularity on this front. "They think that it's art," he says. "I'm in the minority. I grant that my opinion is 1% of 1% of all opinions and that 99% of the audience loves this kind of art [...] I say to the artists: Good for you. You won." Still, Saltz stands by his critiques, which, self-deprecation aside, may be enlightening if you're similarly perplexed by the overcrowded space that is AI art and wondering how to distinguish between the good, the bad, and the ugly.
An artificial dream
Setting aside specific AI art, artists, and algorithms, Saltz takes issue with the now almost ubiquitous term itself. "I think it's a fake category that people use to make a handsome product that wows crowds with super obvious, no-brainer ideas, almost always accompanied by romantic, dramatic music and whizbang, gee-whiz scale." He compares AI art to the work of Norman Rockwell, an American painter and illustrator who was commercially successful but critically derided. "A lot of AI art works in the same way a Rockwell works: by telling you the exact story, describing its characters to a T, and telling you exactly what to think and feel. Everyone who looks at it has the same thought: Wow, cool."
Saltz suggests that the tech world's cultural sway may fuel hype around AI art, particularly through the cults of personality around figures like Sam Altman and Elon Musk. After all, the art world has always been susceptible to hypes and fads; this one may be no different. "It reminds me of how we see all those Instagram posts and TikTok reels of white people lining up to be escorted to the top of Mount Everest in search of their dream, except it's not their dream. It's a dream that was given to them."
As a critic who speaks his mind openly, frankly, and at times coarsely, Saltz has been met with plenty of criticism himself. "All artists have strong reactions," he says, "and God love them for that. They are probably right. I'm a geezer idiot, and younger critics are novice idiots. That's fine. There's no problem with that. But in this sector - AI art - I find it to be especially true." Saltz entered his most publicised altercation with an artist when he reviewed Anadol's "Unsupervised" for Vulture.
Unable to understand why so many visitors were flocking to the Museum of Modern Art, he referred to the exhibit as "a mind-numbing multi-million dollar spectacle," "a house of cards and hall of mirrors," "momentary diverting gimmick art," and a "half-a-million-dollar screensaver." Anadol responded to Saltz's review on X, writing, "ChatGPT writes better than you" and telling the critic he "needs to research, understand the medium" before writing about it.
Asked about this confrontation, Saltz notes that, while all artists take criticism seriously and at times personally, he has found artists working in the AI space particularly combative towards those who question the quality of their work. While Saltz says he researches before he criticises (he created his own NFT before covering the topic for Vulture, for example), it's worth considering the other side of the story. After all, art criticism often walks a fine line between distinguishing good art from bad art and gatekeeping new movements and ideas on behalf of the status quo.
Familiar critiques
The negative reception of AI art, for example, bears similarity to the criticism flung at the Cubists, Fauvists, and other groundbreaking artists of the early 20th century: artists who, despite being ridiculed by established critics, went on to achieve widespread success and acclaim once the rest of society caught up with and began to understand and appreciate what these forward-thinking individuals were doing.
"It's 100% the same pattern," Anadol said in an interview with Freethink. "Been there, done that. The responses we sometimes get are also similar to the ones that Jackson Pollock and Mark Rothko received. In fact, all the heroes of humanity received similar responses. They opened the curtains, and whenever they did, there was this reaction. Artists are the alarm mechanisms of humanity. We always see things way before."
In the same interview, Anadol said there is a clear divide between critics who come to his studio and observe him working and those who glance at the finished products in galleries and museums. Although the second approach is not necessarily wrong (some critics, perhaps Saltz included, would argue it makes for better, unbiased criticism), it obscures the fact that much of AI art, like Cubism or Fauvism, is as much about the creative process as about the art that emerges from that process.
"AI research is heavily focused on trying to make AI as accurate as possible in trying to mimic reality," Anadol says. "But for artists such as myself, we love to break things. We love to do things that are not normal. We want to see, not reality, but chance, dreams, mistakes, imperfections, hallucinations, to find a new language and vocabulary."
Just as early 20th-century abstract art investigated how people see (how our brains and eyes assemble shape and color into meaningful, emotionally resonant imagery), so does AI art explore the machinations that underlie creative expression: how an artist, human or not, collects, analyses, and reassembles data to form something original.
New forms
Another central problem of art criticism is that it is much easier to say what makes something bad than what makes something good. Criticism of AI art faces a further difficulty here: the genre is still young. It hasn't been around long enough to predict how it will develop or, more importantly, how it will be remembered in the future.
Still, we can attempt to predict the legacy of modern AI art by looking back on how the art world responded to previous technological breakthroughs, such as the camera. Much of the artistic experimentation of the late 19th and early 20th centuries came from painters asking themselves what their medium could do that photography could not. In the same way that realistic renditions of people and nature gave way to more subjective expressions of shape, color, and form (actions a camera cannot perform), the AI art of tomorrow will likely focus more and more on things human artists cannot do, such as transforming vast amounts of raw data into compelling visual narratives, or enabling human artists to tweak early drafts of artwork at speeds none of us could ever reach. Conversely, art created by humans is likely to emphasise what algorithms cannot do: love, grieve, contemplate our own biological shortcomings, and aspire to succeed even though such aspirations may be irrational.
"I believe that in the future, there will be a Francis Bacon of AI art," Saltz speculates. "We just haven't seen that artist yet. They have not yet emerged. Art takes a long time. Painting is still emerging and it's been with us for 40,000 years. It's still feeling its parameters. AI is doing that now."
"Right now, everything these artists make has a precedent. They either make a moving, abstract, expressionist painting, or a Jackson Pollock-squiggle thing, or a Walt Disney manga character (boy hormone art). None of it is without precedent. The problem is when some of them say, 'Look at my new Surrealist work,' I say: 'Well, it doesn't look any different from the old Surrealism.' Why should I even look at it at all? I'm interested in new forms. Form is the carrier of deep content, not your explanation of what deep content is!"
Reaffirming the importance of art criticism, Saltz sticks to his opening words: "I never, ever listen to artists. They don't know what their art is about. I know what their art is about. Let me get it wrong. If they disagree with me, fine. But I'm of another generation and there is no more of me. My kind does not exist anymore."
