8 March 2026
I have mixed feelings about the Lego Smart Brick.
I think it is very nice that they are incorporating audio feedback into Lego pieces, since audio is a big part of the play experience and very undervalued. I'm glad that Lego is thinking about and enhancing this aspect. Also, at least from the public information they have released, they are actually focusing on the play aspect and haven't forced AI into it, just sensors and specific programs, which I really appreciate at a time when people put AI into everything (including toys, which is very problematic).
However, I do feel that this is changing the whole premise of playing with Lego, especially for kids, since the sounds are predesigned for specific Lego models. That removes the creative part where kids invent their own sounds for their Lego pieces. They can imagine whatever they want the Lego to be, through both how they assemble it and how they make the sounds. A specific Lego set plus predefined sounds removes this aspect, and then playing with Lego kind of becomes learning how to follow instructions rather than being creative. There are also all the possible privacy issues (even without AI, since it is connected to the internet), which become more serious when it is kids who are using it.
To be honest, with how Lego, as a company, is focusing more on specific themed sets (e.g. flowers, Star Wars ships) and less on open-ended brick sets, it has already been shifting to a less creative track. This Smart Brick is just adding to it. I think the reason is that they are focusing more on adult play now rather than kid play, and this Smart Brick is also more focused on adult play. I think adult play is also important, but we all know this shift is not because Lego thinks "adults can also enjoy playing!" but is more of a financial decision. Still, regardless of the adult play thing, a lot of their marketing materials focus on kids' play, which I think is why I have mixed feelings about it.
3 March 2026
I used to be a bit obsessed with numbers over feelings.
Quantitative methods are great, but now I think they don't tell the whole story (which is really the whole reason mixed methods are used so much).
I used to think research should be completely logical, rational, and objective. Now, though, I think the emphasis of HCI should be on humans, and a human's subjective experience is a very valuable thing. We are very subjective beings, and our feelings, emotions, and subjective experiences embody how we see and prefer things. Quantitative methods and statistical analysis of numbers are a very scientific approach, and I used to think they seemed less biased and more rigorous. But now I feel that the core of these methods is finding correlation. The reason we try to find correlation is that we want to use that correlation to prove causation. I'm not saying correlation never implies causation; it is more about how I understand the relationship between correlation and causation. I partly agree with Hume's idea that causation is correlation that we make sense of from our experience and understanding (I'm not a philosopher, so this is just my very surface-level understanding of Hume's idea). With this understanding, the causation we derive from statistical correlation is also biased in nature.
However, this bias is based on researchers'/designers' understanding of things, so we are the ones who decide which correlations to use as evidence of causation and which causal claims we want to prove. To me, this in some ways enlarges the power asymmetry between researchers/designers and users (I'm pretty sure I'm not the first person who thought of this tho haha), as we are the ones who have the power to decide what is causation and what is not, and then use this causation to make design decisions. There will always be power asymmetry between researchers/designers and users, but the more we rely on numbers, the more unequal it is.
I do need to acknowledge that quantitative methods are more suitable in some cases, and that qualitative analysis is still influenced by researchers'/designers' personal biases too, but I think the difference is in transparency. Quantitative data can also be open and transparent, but the difference is in how visible the reasoning is. The leap from correlation to causation in statistical analysis is often hard to trace, and it is difficult to see where the researcher's assumptions come in, even for researchers/designers themselves. With open qualitative methods and data, the user experience is directly documented and the process is more visible, which makes it easier to reflect on, challenge, and interpret differently.
There is always going to be some bias and power asymmetry, which is not necessarily bad, but I think it is important for us to think about how to be aware of it and reflect on it.