[Cuis-dev] Interesting thoughts about Color

Phil B pbpublist at gmail.com
Mon Apr 29 11:11:05 PDT 2019


Ken,

On Mon, Apr 29, 2019, 9:36 AM <ken.dickey at whidbey.com> wrote:

> On 2019-04-28 21:39, Phil B wrote:
>
> >> I tend to think of immutable, functional data structures.  So I don't
> >> expect 'foo darker' to return a mutable object.
> >
> > OK, now I understand where you're coming from.  I'd simply point out
> > that, in the vast majority of cases, Smalltalk code has not been, nor
> > is it currently, immutable or functional 90+% of the time.
>
> Your code does not use numbers?  Or #symbols?  ;^)
>

Bad examples, for different reasons, IMO.[1]  A better comparison would be
Point: it's both a better immutable implementation and (for the most part)
uses better names, since the names and their meanings are more broadly
useful.
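
For example, Point already reads the way I'd want Color to: the ordinary
arithmetic messages answer new Points rather than mutating the receiver,
so the immutability is obvious from normal use (this is just the standard
Point protocol, nothing new):

    | p |
    p := 3 @ 4.
    p + (1 @ 1).    "answers a new Point 4@5; p itself is untouched"
    p negated.      "likewise answers a new Point"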


> You want to use '0.3 asSin' rather than '0.3 sin'?  ;^)
>
> Natural languages have irregularities and dialects which we learn.  I
> find Interlingua easier to read than Esperanto, even though it is less
> regular.  Perhaps it is just me.
>

The last thing I want is for Smalltalk to adopt the ambiguity of languages
used in human communication.  AppleScript was enough to convince me that
this is not the way to go.

>
> I do change things with a long history.  But #darker I understand
> immediately.
>
> Scheme uses '!' to indicate mutators/setters. Would you propose
> #at:put!: or some such?
>

Mainly, I just want immutable objects to feel immutable (i.e. to not have
setters that you need to 'just know' not to use after instance creation,
etc.).
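
To sketch what I mean (just illustrating the shape of the protocol with
the existing Color r:g:b: and #darker messages, not proposing specific
code): all state gets fixed at instance creation time on the class side,
and anything that looks like a change answers a new instance.

    | c c2 |
    c := Color r: 1.0 g: 0.0 b: 0.0.   "state fixed at instance creation"
    c2 := c darker.                    "answers a new Color; c is untouched"
    "...and no instance-side setters exposed to tempt anyone afterwards"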

>
> OK, I am being a bit difficult here, but pull us along.  Change costs
> something.  Perhaps some sample code would show the benefit and let me
> be embarrassingly wrong here.  Hey, it would not be the first time!
>

I have no problem with change.  If we want immutable classes, let's make
them as truly immutable as possible within the limits of the VM.  If we
want to make things functional, then let's do that, but consistently and
with purpose rather than in a partial, ad hoc fashion.

>
> Cheers,
> -KenD
>

[1] a) I think math in general is a bad example because it's probably the
best-understood DSL in existence.  It obviously predates computer languages
and is baked into (every?) human language, which makes it rather unique.

b) To me, the usage of symbols in the image is a tragic historical mistake
in that the same symbol table is used in an ad hoc way for class names,
selectors, enums and strings.  So when I see #x in the image, is it meant
as a selector, an enum or a unique string?  Unfortunately, the answer is
'yes', and to disambiguate you have to read the code.  It's another variant
of semantic overloading I really don't care for.
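
To make that concrete, here's a contrived workspace snippet where the same
kind of literal plays three different roles, and nothing about the literal
itself tells you which:

    | alignment |
    (3 @ 4) perform: #x.        "#x used as a selector"
    alignment := #left.         "#left used as an enum-ish value"
    Smalltalk at: #Transcript.  "#Transcript used as the name of a global"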

Thanks,
Phil
