Accessibility has always been a topic that interested me. Writing standards-based HTML brings a baseline level of accessibility with it.
But often the interfaces we make are more complex than what you can build with standard HTML. Consider this design for a search dropdown.
As a UI designer with front-end skills you are always in between two worlds.
Do you follow the logic of the standards-based implementation or do you do your own thing? For some things like <select> dropdowns it is obvious that there are massive benefits to using the standards-based method.
In slides 24 to 47 of this old presentation about UI design you can find a full argument for the benefits of not customizing dropdowns. (This is a presentation from 2012 and I am still writing about this – oops.)
Knowing all of the benefits of not customizing dropdowns, I still often customize dropdowns in the interfaces I create. A standard <select> element simply isn’t good enough for some of my use cases.
So we have to provide a custom interface.
Which in turn is a painful choice, because we are basically throwing away the built-in accessibility of the system. We’re basically saying: sorry, it is more important for this group of users to work with this than it is to be accessible.
The knowledgeable person will point me to WAI-ARIA attributes; but because I don’t use a screen reader in daily life, it is quite hard for me to test a custom control for accessibility.
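For context, this is roughly what the ARIA side of a custom search dropdown involves – a minimal sketch based on the WAI-ARIA combobox pattern, with illustrative class names and ids. The markup alone is not enough: a real implementation also needs keyboard handling and focus management, which is exactly the part that is hard to verify without a screen reader.

```html
<!-- Minimal sketch of the ARIA roles for a custom search dropdown.
     Class names and ids are illustrative. A real widget also needs
     keyboard handling (arrows, Escape, Enter) and focus management;
     see the WAI-ARIA Authoring Practices for the full pattern. -->
<div class="c-search">
  <input type="text"
         role="combobox"
         aria-expanded="true"
         aria-controls="search-listbox"
         aria-activedescendant="search-option-1">
  <ul id="search-listbox" role="listbox">
    <li id="search-option-1" role="option" aria-selected="true">First result</li>
    <li id="search-option-2" role="option">Second result</li>
  </ul>
</div>
```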
In my own presentation from 2012 I argued that engineers from Apple, Google and the like have worked hard on providing these interfaces. Fast forward seven years and there is little evolution in custom form controls. Just last week I discovered Safari still does not support <input type="date"> on the desktop. The current situation is quite painful, to say the least.
I’m pretty tired of some small-time bullshit that I see on Twitter sometimes. Some people really take their self-promotion too far.
Today I was reminded of someone who, whenever I met him in real life, seemed like a cool guy. I respect most of his writing and opinions. In fact I probably learned a thing or two from reading his blog.
But man… all the online communication I ever see of this person is about promoting his company, his new video web series, whatever.
I am a business owner myself. I understand the need to do sales. Sure, I drop my company name when someone asks for design help on LinkedIn.
But I can’t help but think… too much is too much. If the only thing I hear from you on social media is incessant self-promotion, I can’t find the mute button fast enough.
There are two main schools of thought in CSS frameworks.
There is the old guard (Foundation/Bootstrap) that is good for entry-level work and learning, but creates problems on bigger projects due to style leakage and a lack of strictness in namespacing.
Then there is a newer school of thought built around lots of utility classes (which might get recompiled to a file containing only the classes actually used). Frameworks like Tailwind CSS. The logic behind this might be sensible, but it is too complicated, especially when considering reusing the utility class logic across multiple projects (answer: it doesn’t work).
Next to this there is a clear movement towards “componentisation” in CSS. The CSS workflows used around MVVM frameworks (such as Angular) and React/Vue often have their own ways to prevent style leakage. There are various projects like styled-components that make sure styles don’t leak outside of components.
However, these projects assume a full-on JavaScript-based development environment and are too heavy-handed in general. They re-implement parts of CSS and forego the strengths of native CSS.
It’s very painful to see a re-implementation of CSS in JS that only supports parts of CSS.
I have always liked a “pick and choose” approach for components. In Bootstrap for example you can take parts out of it that you don’t use (Jumbotron, Badges etc.) and the framework will still function.
However, Bootstrap suffers from overly complex code. Some parts of Bootstrap feel like somebody wanted to use every language feature of Sass just because they could.
In the end, if you are trying to debug something, you will end up reading the source that generated the code. If that source is too complex, that is no good.
One can wonder: why use a framework at all? There are of course several reasons. In bigger projects you never have enough time to do the deep work that is necessary to provide a quality user interface, especially when it comes to the more difficult components like datepickers or modals.
A framework that is already focussed on performance and accessibility and used in the right manner will help a project tremendously.
In my opinion, there is space in the “CSS framework market” for a great BEM/ITCSS framework that is well documented, namespaced and tested. Something that has good documentation and well-tested components. What do you think?
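To sketch the direction I mean: BEM naming with a component prefix keeps styles strictly namespaced, so they cannot leak into unrelated markup. (The class names below are illustrative.)

```css
/* A sketch of a namespaced BEM component: the "c-" prefix marks it
   as a component, and every selector stays scoped to the block.
   Class names here are illustrative. */
.c-card { display: flex; }
.c-card__image { flex: 0 0 auto; }
.c-card__body { padding: 1rem; }

/* A modifier creates a variant without leaking styles elsewhere */
.c-card--featured { border: 2px solid currentColor; }
```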
This is the making-of blog post in which I discuss the technical explorations I went through to get to the end result.
Making these kinds of websites is not a new thing for me. I made one in 2008, 2009 and 2012 as well – but these are offline by now. I like to use these projects to experiment with some new things in the (usually) quiet period of the Christmas holidays.
Content data sources
When I create HTML prototypes for work I’ve gotten into the habit of creating JSON data structures whenever there is repeating content.
This is pretty much a given when using the JAMstack.
In last year’s “Best of” I chose to pull data from an external source: Airtable. While it was cool, I eventually realized this was pointless: the data was hosted on Airtable, and Airtable became a dependency.
Using Airtable as a source also generated overly complex JavaScript objects (example). This was good practice in learning how to manipulate them (practice I used during the year when experimenting with the Figma API), but ultimately not a great solution.
This year I tried some alternatives. One of them was to use a headless CMS. I tried sanity.io to create data structures, the underlying idea being to combine structured content with schemas that can be validated.
While this tech solution is really interesting, I gave up: it was just too much work to get into something totally new for a one-off project.
I ended up writing the JSON structure by hand. This is better for the use case since there are so few records in this “database”.
This is much cleaner than the crazy JSON files that came out of Airtable.
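The hand-written structure is simply a flat list of records; something like this (the values below are made up for illustration, but the field names match the template further down):

```json
{
  "albums": [
    {
      "artistName": "Example Artist",
      "albumName": "Example Album",
      "releaseDate": "2018-03-09",
      "fileName": "example-artist-example-album.jpg",
      "categories": ["Electronic", "Ambient"]
    }
  ]
}
```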
However, manually editing JSON files is not a “client-proof” solution, so something tells me I need to explore the headless CMS direction a bit more in 2019.
The way to render the data looks like this:
<ul class="c-cards">
  {%- for item in albums -%}
  <li class="c-card">
    <div class="c-card-image">
      <img src="/images/albums/{{ item.fileName }}" alt="{{ item.artistName }} - {{ item.albumName }}">
    </div>
    <div class="c-card-body">
      <h2 class="c-card-heading">{{ item.artistName }} - {{ item.albumName }}</h2>
      <p class="c-card-meta">{{ item.releaseDate }}</p>
      {%- for category in item.categories -%}<span class="c-card-category">{{ category }}</span>{%- endfor -%}
    </div>
  </li>
  {%- endfor -%}
</ul>
The templating language used in the above example is Nunjucks.
Content first
One of the conclusions of the 2017 edition was that I should worry about the content first instead of the tech. So for the 2018 edition I prepared all of the content in Notion and wrote most of it before even touching any technicalities.
Notion as an application has been one of the revelations of 2018 for me. It’s a knowledge base – a note-taking app – but also so much more. I used Notion to manage the to-dos for this project, to work on the actual content, and to write this very blog post.
In the process I learned how to add columns to pages in Notion, and a few shortcuts.
When I felt like I got far enough content-wise, I exported the content to markdown using Notion’s export function to start work on the actual website.
11ty (eleventy) instead of Bedrock
For the 2017 edition I worked with our static site generator Bedrock to create the templates for the Best of website.
While I obviously like working with Bedrock, the point of this project is to learn something new, so I wanted to give the new static site generator 11ty a try (thanks Jérôme for the tip!).
11ty is pretty unique as far as static site generators go, since it tries to get out of your way by putting fewer restrictions on how you structure your content and in which format you write it than other static site generators do.
This has its good and bad parts. In a way it is very flexible. In another way it’s also very ambiguous about what is possible and what isn’t.
Theoretically, you could create parts of your content in Markdown, parts in Pug and parts in Nunjucks if you wanted to. I really liked working with 11ty, but in the end it wasn’t as flexible as I wanted it to be. Perhaps it will be in the future, since the project is at version 0.6 – not a 1.0 yet.
In the process of creating the Best of 2018 website I created a Gulp 4-based workflow around 11ty, which I might share publicly in the future. It’s not perfect yet, but it works. I discovered once again that you can lose a lot of time trying to optimize workflows. First you think setting up Gulp is simple, and hours later you’re still debugging the perfect sync between incrementally generated files and injecting CSS/JS changes live.
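For reference, a minimal sketch of what such a Gulp 4 setup around 11ty could look like. Everything here is an assumption on my part – task names, paths and the use of npx are illustrative, and a real setup would add Sass compilation, incremental builds and live reload:

```javascript
// Sketch of a Gulp 4 workflow around 11ty (assumes gulp is installed;
// task names and paths are illustrative).
const { src, dest, series, parallel, watch } = require('gulp');
const { exec } = require('child_process');

// Run 11ty as a child process to build the HTML
function eleventy(cb) {
  exec('npx eleventy', cb);
}

// Copy styles to the output folder (a real setup would compile Sass here)
function styles() {
  return src('src/css/*.css').pipe(dest('dist/css'));
}

// Rebuild on changes to templates or styles
function serve() {
  watch('src/**/*.njk', eleventy);
  watch('src/css/**/*.css', styles);
}

exports.default = series(parallel(eleventy, styles), serve);
```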
Design and CSS
Design-wise I had started on a concept design in October. I didn’t get very far because I decided I would keep a similar design as last year’s Best of.
Having decided that, it seemed pointless to design things in a design app.
When I started coding I did change the technical underpinnings.
I used Mono’s internal BEM/ITCSS framework (called Jungle) which has matured over the past year. This meant I had to write very little custom code.
Because of those technical underpinnings, I literally spent one hour making the whole thing responsive.
Conclusion
In a lot of ways the end result is exactly what I made last year, but the technical underpinnings have changed.
In the process I took the time to look for tools and techniques that can remain staples of my workflow for years to come. Just like last year there’s going to be techniques that I will use immediately on my first day of work in 2019, and there will be things that I’ll probably never touch again.
All tech nerdery aside, I hope you enjoy the content itself. They’re my favorite things I watched and listened to this year. I wish everyone a splendid New Year’s Eve and see you all in 2019.
The end of the year often brings a moment of reflection: what exactly did I do this year, and where is it all heading?
Professionally I had a very good year. At Mono we are working on the right UI/UX design projects for the right clients, and that makes me happy. I finally got to work on a big project in the medical sector that I had dreamed about for a long time; and we also did interesting work for existing clients in, among other fields, supply chain.
The team has grown with quite a few new faces, which brings a completely different dynamic. I had to make the transition from designer to (partly) manager, from trying to do the best work myself to enabling others to do their best work. That means setting people up properly, investing in personal growth, and creating an environment in which the best work can happen.
I say partly manager, because I can’t resist continuing to design and code myself. I am still very interested in keeping myself up to date on new design tools, techniques and the underlying technical aspects. I gave a workshop on Figma and spent a lot of time designing and documenting all kinds of design systems.
I still dream of being able to casually whip up some JavaScript and Python, but I fear that unless I make a radical career switch to developer, it will remain somewhat hobbyist. It remains a discipline of its own.
From September onwards – because of the municipal elections – politics surfaced as a theme for me. In recent months it has been an active interest. For the first time in years I took out a newspaper subscription, and I actively read the news. I go a step further and also read the sources the news is based on, such as the recent coalition agreement in Antwerp. I find it essential to be able to form my own opinion based on what actually happens, and not only on reporting. An opinion based on facts, not on (often poorly) substantiated opinions or one-liners on Twitter.
One of my professional goals is to keep the use of computers human. All too quickly technology is seen as a holy grail that will solve all kinds of human problems. Far too often, using technology creates a new problem that simply replaces the old one. I want technology to be used in the right way, in a human way. You can make technology human by working on the user experience, but if the premise is wrong, you can work on the user experience all you want – it won’t help. And that premise is often a political decision.
Besides that, in my free time I have enjoyed films and cooking. Going to the cinema almost became a fixed habit – with the podcast as a guide – until the film offering suddenly dried up around late summer. In the kitchen I have gained a new appreciation for quality ingredients, good equipment and simple dishes. I look at cooking, and by extension at food, in a completely different way than a few years ago.
With the climate crisis in the back of my mind, it is clear we will have to live differently. What form that will take, I don’t know yet. As a meat eater, car owner and air traveler, I am aware that I am part of the problem. And I can’t shake the impression that many supposedly “ecological” initiatives are not as ecological as we would like to think. Something to study in 2019.
I wish everyone plenty of rest around this time, and time with family and friends, so that after a fine New Year’s Eve we can head into 2019 with new energy and full enthusiasm. Cheers!
Accessibility was top of mind in a recent design project where a lot of the branding was orange; the problem was that white text on orange buttons didn’t pass the WCAG 2.1 contrast requirements. The project was government-related and aimed at a large audience, so I felt an extra responsibility to do things right.
In a design like this the button at the bottom would totally fail accessibility tests:
I wanted a simple way to switch the orange in my designs to a darker orange (a11y orange) in order to pass WCAG requirements.
Basically I wanted to write CSS like this… (warning, pseudocode, this does not actually work!)
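The idea was something like this (to repeat: prefers-high-contrast is a made-up media feature that does not exist, and the color values are illustrative):

```css
/* Pseudocode – prefers-high-contrast is a hypothetical media feature.
   Color values are illustrative. */
.c-button {
  background-color: #ff7700; /* brand orange */
  color: #fff;
}

@media (prefers-high-contrast) {
  .c-button {
    background-color: #cc5500; /* darker "a11y orange" */
  }
}
```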
The @media (prefers-high-contrast) query would check whether high contrast mode is turned on in the user’s OS. In macOS, for example, that setting looks like this:
Currently, there is no standards-based way to detect high contrast mode. There is an old implementation that only works on Windows (and I suppose only in IE) – which ties into the Windows theme settings I believe:
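That Windows-only implementation is the non-standard -ms-high-contrast media feature (IE 10+ and the pre-Chromium Edge), which reflects the active Windows high contrast theme:

```css
/* Non-standard, Windows-only: matches when a Windows high contrast
   theme is active (IE 10+, pre-Chromium Edge). Selector is illustrative. */
@media (-ms-high-contrast: active) {
  .c-button {
    /* System color keywords follow the user's chosen theme */
    border: 2px solid windowtext;
  }
}
```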
I wish this were available in a standardized way. Now I often have to choose between accessible and ugly.
Some applications try to solve the high contrast problem with their own implementation. In KBC Touch (banking app) this is a global on/off switch:
In Teamweek there are more fine-grained settings that also help with other disabilities (e.g. color blindness).
But in my opinion a setting per app is kind of pointless. What’s funny is that in the Teamweek example above, the control to toggle high contrast doesn’t have high enough contrast itself.
It’s great that the teams behind the two apps above thought about accessibility, which already makes them superior to most apps out there. But local settings in individual apps, built by teams that think about it and go to great lengths to make a custom implementation – that’s not good enough.
We need a standards-based way to tie into an OS-level setting; and people with low vision need to know about that setting so they can turn it on. It will help them across the board.
I filed some bugs and participated in discussions but I need your help to bring this to the attention of browser vendors.
If you have an account on GitHub and/or Bugzilla, please participate in the discussion or voice the need for this from the standpoint of web developers.
You can also share this post, hopefully it can reach the right people that way.
KBC is the first bank to add the purchase of train tickets to its mobile app. I can only applaud this. The quality of the KBC mobile app is very high. I hope they can maintain the same quality in this experience.
I believe it has been possible for a while to buy train tickets through third parties, but only for international journeys, not for domestic ones.
My own train ticket experience, despite the many campaigns around the NMBS app and how easy it is to use, is still largely on paper. This is a direct consequence of the pricing of the Rail Pass.
For those who don’t know: a Rail Pass costs 76 euros in 2nd class and 109 euros in 1st class. The Rail Pass is good for 10 trips from any destination in Belgium to any destination in Belgium.
A fixed price per trip, in other words, on the condition that you use up your pass within the year.
In practice my train routine comes down to carrying a pen, writing my route on a Rail Pass, and keeping the receipt or the Rail Pass for my bookkeeping.
That is all well and good, but I try to keep as little as possible on paper.
Now, the chance that I can use this system digitally (even if I wanted to) is rather small, because for me it would effectively mean that everything gets more expensive.
There is simply no good pricing model for someone who frequently needs to be in several Flemish cities. That is why I use a Rail Pass. Digitally, you can only buy single tickets.
A season ticket, you say? Despite my frequent train use, there is no season ticket that is cheaper than using the Rail Pass. So I would welcome a digital version of the Rail Pass. I picture it as a kind of internal counter of how often you have taken the train, with a discount once you have taken the train often enough.
Because that is effectively the principle of the Rail Pass – you get a discount because you promise to take the train at least 10 times.
For short trips I have no good solution right now; I have to go get a paper ticket, and that costs time at the counter or the machine. Because I am not going to pay 7.6 euros to take the train from Antwerpen-Berchem to Antwerpen-Centraal, or from Berchem to Mechelen.
Another “problem”: for years I have used a combination of a 1st class Rail Pass and a 2nd class Rail Pass. That way I can dynamically switch between classes depending on how busy it is, which train it happens to be (1st class comfort differs per train) and how much energy I have to work on the train.
All this being said, I would urge the management of the NMBS to digitize the Go Pass/Rail Pass formula.
I wanted to bind ⌥+⌘+T to be able to use macOS’s tags feature using a keyboard shortcut in Finder.app.
The way to set it up is super vague, so I figured I would share what you need to do to make it work.
First, set it up in System Preferences > Shortcuts. Attach the shortcut to Finder.app. Make sure to use a real ellipsis character (…) instead of typing 3 periods (...).
Then, make sure that in Finder preferences you have no favorite tags, otherwise it will not work. In the screenshot below you will see that there are no favorite tags.
This seems like a bug in macOS, so I filed it as a Radar.
I tweeted “Just self-host your things. External services can change the rules at any minute and they will.”
This was mostly in response to people complaining about Flickr’s recent policy change, which limits free accounts to show the 1000 latest photos.
It’s also sort of a retort against Medium, a service I have been using on and off but that I should really stop using, since the end user experience of people visiting articles has become pretty shit. For a service that was once focussed on reading, they throw an awful lot of cookie-like banners in your face. And the promise of paying writers is not happening at all, but that’s a different discussion.
The point to self-host is mostly a point about taking care of your digital belongings.
I got responses arguing that not everyone is as technical (as me), or that most people don’t care to learn all of these things. I absolutely understand there is a learning curve to taking care of your digital belongings, but if you really care about what you make, you will either make the effort or pay someone to do the “digital conservation” for you.
This can be as simple as paying a company you trust (i.e. Apple) for a service that you think will be around for years (i.e. iCloud). Or if you like to pay with your data you can do the same with Google.
But if you really care about digital conservation while sharing your content at the same time, you should have a combination of local backups in different places and this thing called a website where you share your things.
The thing is, if you are not paying for a service, you are effectively at the mercy of their policy changes; and if you do pay for a service, it’s still the same, except there’s a bigger chance that the service in question will stay around.
But in the end, things shut down, companies change, or they start to suck. I recently cancelled my CloudApp subscription because the service just got worse over the years instead of better. I think most services progressively worsen instead of improving.
Or, instead of worsening, they can simply go out of business. How many apps and social networks have died over the years? There’s a graveyard of apps that just shut down. If you depended on any of them heavily, you had to do the work of getting your data out, if that was even possible.
We recently started using Notion, and one of my first worries was how to get data out of it. Thankfully there is an export option – otherwise I would probably not use the service.
Now, some of these things I am writing about might make you think that I’m a hardcore digital archivist.
I agree that over the years I have taken measures that are probably a bit techier than others to try and do some “digital preservation”.
I changed my mind at some point about what needs to stay online (I once believed that since Cool URIs don’t change, I should keep everything I ever published online), but regardless, how information can be preserved over the years is a topic of interest to me.
I quite like that I can link to photography.johanronsse.be – a website that I made 6 years ago – and know that it’s still there.
On the other hand, I was quite pissed that a static website I made, hosted on github.io, just died because GitHub changed their internal logic. My fault for depending on GitHub to host my stuff.
What I want to say in a nutshell is – if you depend on an external service, realize it can shut down at any second. Have local backups. Make redundant backups. Take care of your digital belongings.