[Cuis-dev] Update: Alt+Return as a full screen toggle shortcut
Mauro Rizzi
mrizzi at fi.uba.ar
Sat Dec 19 05:48:16 PST 2020
Responsibility models aside, maybe we could have an object somewhere that
holds a collection of possible keyboard event triggers and the actions they
should perform? That way, if you want to add a new keyboard shortcut you
just tell that object to add it to its collection, and when you want to
check whether a combination should trigger something you just ask that
object.
I think it would make sense from a usability standpoint, since it would let
the user define system-wide shortcuts without touching system class methods
or having to understand the chain of responsibility that detects the input.
It would also mean that defining an erroneous trigger (accidentally, of
course) wouldn't leave the keyboard unusable until you revert your change.
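Something like this, just to sketch the idea (all the class and message
names below are invented for illustration, not existing Cuis classes):

    Object subclass: #ShortcutRegistry
        instanceVariableNames: 'actions'
        classVariableNames: ''
        category: 'Morphic-Events'

    initialize
        actions := Dictionary new.

    on: aKeyCombination do: aBlock
        "Register aBlock to be evaluated when aKeyCombination is seen."
        actions at: aKeyCombination put: aBlock

    handleKeystroke: aKeyboardEvent
        "Answer whether the event matched a registered shortcut."
        | block |
        block := actions at: aKeyboardEvent keyCombination ifAbsent: [ ^ false ].
        block value.
        ^ true

Adding a shortcut would then be just something like

    registry on: 'Alt+Return' do: [ "toggle full screen here" ].

and the event-dispatching code would only have to ask the registry, in one
place, whether a keystroke was already consumed.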
Cheers!
*Mauro Rizzi*
On Sat, Dec 19, 2020 at 6:59, Luciano Notarfrancesco (<luchiano at gmail.com>)
wrote:
> I think the hand is a metaphor for the actual user's hand. Keyboard events
> also go through the hand. In my opinion this emphasises the tangibility of
> morphs, and having the option to create new kinds of hands enables
> experimentation that could lead to interesting places (as happened in
> Squeak with MorphicWrappers, where I made a new hand that implemented the
> “typing on air” thing).
>
> I’m not sure how all this would work with touch screens, though... or what
> the idea is for HandMorph in Morphic 3. But I don’t much like how we’ve
> been hooking events and implementing shortcuts in the event classes
> themselves: that behaviour isn’t all in one place, it’s less flexible, and
> when I need to redefine it, it’s sometimes not easy to find the code I
> have to change.
>
> On Sat, 19 Dec 2020 at 3:26 PM, Philip Bernhart <philip.bernhart at posteo.de>
> wrote:
>
>> Hello all.
>>
>> Luciano Notarfrancesco via Cuis-dev <cuis-dev at lists.cuis.st> writes:
>>
>> > Btw, I think the Morphic way to implement things like this is with
>> > hands. User input should always go through the hand, and a particular
>> > kind of hand can implement global shortcuts (for example to open a
>> > browser, alt-. to open a debugger, alt-tab to switch between windows,
>> > opening of halos, etc).
>>
>> Interesting insight. But why the hand? As far as I have seen, a hand in
>> Morphic was always the mouse pointer, of which there could be many
>> instances.
>>
>> These days a hand isn't necessarily a mouse pointer anymore; it could be
>> a multitouch input event, a gesture, or one of many remote users living
>> on different machines.
>>
>> Wouldn't that rather belong to the world in which the morph is running,
>> since a world defines the global context, hence the word "world"?
>>
>> Philosophical question: do we even know at this point what is supposed
>> to be "true" Morphic, after decades of adapting a lively system and
>> porting it from the original Self to Squeak and then to the needs of
>> each Squeak-derived system?
>>
>> I, for example, only know Morphic from the Self movie, where you could
>> spawn morphs and adapt them according to your needs.
>>
>> My gut feeling is to do whatever simplifies the system and bears its
>> own weight.
>>
>>
>> Cheers,
>> Philip
>>
>> --
>>
>