It looks OK to me:
    (define (time-to-move from-pos to-pos)
      ;; Calculates the total time to move between two positions,
      ;; assuming rotation and ascension can occur simultaneously.
      (max (time-to-rotate (position-azimuth from-pos)
                           (position-azimuth to-pos))
           (time-to-ascend (position-elevation from-pos)
                           (position-elevation to-pos))))
Indeed. Note that this is how the function definition looks in Common Lisp (yours is Scheme):
    (defun time-to-move (from-pos to-pos)
It's like the author didn't understand accessors? They don't need the with-accessors. They also might not need with-slots, had they defined an accessor for last-duration:
    (with-slots (last-duration) antenna
      (setf last-duration total-time))
    =>
    (setf (last-duration antenna) total-time) ;; given last-duration is an accessor: both a getter and a setter
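For context, here is a minimal sketch of a class that would make the accessor form work (the class and slot layout are assumptions, not taken from the article):

    (defclass antenna ()
      ((last-duration
        :initform 0
        ;; :accessor defines both a reader and a (setf last-duration) writer
        :accessor last-duration)))

    ;; which is exactly what the direct setf above relies on:
    (setf (last-duration (make-instance 'antenna)) 42)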
Yeah, the total run-scan function could be fairly short:
    (defun run-scan (antenna scan)
      "Perform a scan on this antenna. Return a scan log."
      (let* ((slew-time (move-to antenna (scan-position scan)))
             (on-source-time (delay antenna (scan-length scan))))
        (setf (antenna-last-duration antenna) (+ on-source-time slew-time))
        (make-scan-log scan slew-time on-source-time)))
Yet the author injects needless complexity with macros and claims "I think it’s possible to write clearer Lisp code, I just don’t know how." I don't get it.
Yeah, I was going to say. The simplicity and readability of Lisp code is inversely proportional to the use of macros.
There's a lot in this article that I really identify with: the preference for a functional style (even if I'm just not smart enough to do the whole ultra-abstract symbol soup Haskell thing); the comfort I feel leaning on the compiler; "it also always feels like I’m tricking Lisp into doing what I want, rather than simply expressing what I want" feels like it came right from my brain.
God, Lisp...the core language isn't exactly that interesting in this day-and-age. Dynamic typing, garbage collection, anonymous functions, this has been the bread-and-butter of Python, JS, Ruby, etc. developers for like 20 years now. It's got some really powerful features like conditions and CLOS. But then the bulk of the language and library built on top is such a mess: it's missing so much basic functionality, but also has so many niche functions that can be configured every which way with niche keyword arguments, and they all turned out to be evolutionary dead-ends and much worse than what the rest of the world settled on (actually CLOS probably falls into this category too). I think it's this, more than anything, that makes programming in Lisp feel like an act of trickery rather than clear thinking.
But I'll also say that I've been hobby programming in Lisp a bit recently and, despite that, I've been finding it immensely pleasurable, the first time I've really enjoyed working with a computer in years. The highly interactive, image-based workflow is just so much smoother than my day job, where I'm constantly jumping between VSCode and Chrome and a console and manually rebuilding and sitting around waiting...
Macros may be a double-edged sword - they encourage monstrosities like LOOP, rather than building more powerful/regular/comprehensible language features like iterators or comprehensions. BUT when paired with that interactive development experience, it really feels like you're in a dialogue with the computer, building out a little computational universe.
I learned Common Lisp during the life of this HN account, and there's an amusing trajectory in my comments, from "I hear Lisp is beautiful and I'd love to learn it", through "Wait, Common Lisp is kind of a horrendous mess", to "OK, I get this, this is very cool".
Common Lisp genuinely expanded my thinking about programming, so I find this article's poetry analogy very apt. But part of this growth was clarifying my own preferences about programming, and some of CL's greatest strengths - extremely clever runtime meta-programming, CLOS, etc - are not actually things I want to work with in a code base at scale.
I also think the UX for CL, outside the commercial Lisps, is pretty grim. CL would greatly benefit from a rustup-cargo-like system with good dep management and sane beginner defaults, like wrapping the SBCL REPL in rlwrap. Haskell is more beginner friendly than CL right now, and that's quite the indictment.
> good dep management
you can see https://github.com/ocicl/ocicl and https://github.com/fosskers/vend (newer)
> sane beginner defaults
agree. CLISP's REPL is beginner friendly, but we shall not advise it today. I'm doing this: https://github.com/ciel-lang/CIEL/ Common Lisp with batteries included. The terminal REPL has completion and syntax highlighting. You can run scripts easily. You get many common libraries. (beta)
>God, Lisp...the core language isn't exactly that interesting in this day-and-age. Dynamic typing, garbage collection, anonymous functions, this has been the bread-and-butter of Python, JS, Ruby, etc.
CL still got symbols, the reader (and its macros), gradual typing and user available runtime compilation (compile and compile-file).
I find the core language itself near perfect (what's mostly missing is the historic stuff like threads/atomics/unicode, the whole divide between normal CL and CLOS, and the lack of recursive and parametric typing), but the standard library quite baroque and lacking; still infinitely more serviceable than C's, though.
(To be clear, that historic stuff is present in today's implementations.)
> CL still got symbols, the reader (and its macros), gradual typing and user available runtime compilation (compile and compile-file).
TypeScript has all of these, too.
JS Symbols are like CL gensyms, and CL symbols (interned strings) don't exist in JS.
Normal TS doesn't expose normal macros or reader macros to the user.
JS runtimes don't have the image introspection ability of CL.
TS types are removed before the JIT and don't actually impact performance (not to mention they encourage polymorphism, which is actively BAD for JS performance). CL type hints are used by the compiler to actually speed up the output code (quite dramatically, I'd add).
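For a concrete flavor of that (a minimal sketch, with names of my own invention):

    ;; With these declarations, SBCL compiles the arithmetic on unboxed
    ;; double-floats instead of dispatching through generic +/*.
    (defun hypot-squared (x y)
      (declare (type double-float x y)
               (optimize (speed 3)))
      (+ (* x x) (* y y)))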
Who is using macros in TS?
Really? You can `(compile nil my-lambda)` without having to write a JIT yourself?
I meant reader macros, by the way: can it do that?
    ; A comment
    (set-macro-character #\% (get-macro-character #\;))
    % Also a comment
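And the compile half, for the record, is a one-liner in any conforming CL:

    ;; hand a lambda form to the compiler at runtime, then call the result
    (funcall (compile nil '(lambda (x) (* x x))) 7) ; => 49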
Give Janet a try, it's a breath of fresh air with well-realized primitives and a solid standard library. For a decade now I've been immersed in the worlds of Clojure and Nim, where macros often get abused to no end. I've written 0 Nim macros, and a number of Lisp macros I could count on one hand.
I agree on macros: I have been using Common Lisp since 1982 and I almost never write macros. To be fair, I like my Common Lisp code to be readable (to me) and don't care as much about conciseness.
In any case, I think every developer should have a few languages that they really enjoy using.
The Smalltalk implementation is off. Scan is a verb, not a noun, and I think the readability the author was seeking would be plainer if the code sent messages to an antenna asking it to move to a position and report back its scan times and such.
Oooo, such pretty solutions here. What do people think of this APL option?
(destEl-sourceEl)⌈destAz-sourceAz
Honestly, though, we don't really know what any of the reference implementations do. The core structure is just (max (f x y) (g z w)), and we simply have to guess at the meaning of f and g from their names. The above APL takes a similar liberty, but instead of declaring functions by fiat, we declare by fiat that our data is sufficiently normalized.
Here's my take on performScan, too:
Log⍪←onSourceTime+slewTime
> The above APL takes a similar liberty, but instead of declaring functions by fiat, we declare by fiat that our data is sufficiently normalized.
Why would you make an assumption like this and try to compare code? If the sample code were simply subtraction then OP would've written subtraction. What use is demonstrating APL code that solves a different problem? If you're gonna do that, instead of removing part of the problem you might as well remove most of the problem and just say the solution is:
⌈/dest-source
And then write a sentence about how that's valid because you declare the data is normalized even further.
> Here's my take on performScan, too:
> Log⍪←onSourceTime+slewTime
How is this valid if you don't actually do any work to calculate onSourceTime or slewTime? You took the last line of performScan and implemented it. You do need to also implement the other lines in the function.
Maybe I'm just completely missing something here.
Thanks for asking!
> You do need to also implement the other lines in the function.
What lines, though?
All we have in the original are three opaque functions composed in a Dovekie pattern, i.e. (f (g x) (h y)). Everything else—besides the discussed overcomplications—is just suggestive naming.
I actually agree with you! The demonstrated code is trivial, and I believe you're right to complain. It's just that APL makes the triviality painfully obvious.
> If the sample code were simply subtraction then OP would've written subtraction
The reason is that functional design patterns are very different from APL design patterns.
In functional programming we manage complexity by hiding it behind carefully-designed functions and their compositional structure. APL manages the same by normalizing data and operating over homogeneous collections. These lead to very different looking code, even without the symbols.
So, where Haskell hides a bunch of calculation behind timeToRotate and timeToAscend, with APL we usually choose to pre-process the data instead. This means that where Haskell needs timeToRotate etc., APL can just use a really simple primitive.
While we're granting the author that these opaque functions are correctly implemented, I think it's fair to grant that the APL correctly normalizes its data. The unshown work is going to be about the same either way.
Don't you think it's intriguing if we end up with less work and clearly trivial code?
can you normalise harder?
2d is bigger than 1d, but sometimes it is smaller too:
⌈/dest-source
In practice, maybe!
I'm really trying to play as fairly as possible, though. The original suggests that the ascension and azimuthal data might need to be treated separately. We really don't know.
The basic call graph shape of `(a-b)⌈x-y` also directly mirrors that of timeToMove, so I think I'm taking as few liberties as possible while also showing what APL design patterns can get us.
> This almost doesn’t look like code to me. I would expect any programmer to be able to at least get some sense of what’s going on here, even without knowing Haskell. I don’t think I could say the same thing about this piece of Lisp:
I only understand the Haskell code as well as I do because the comment describes what it does (much more clearly than the docstring on the lisp function). Others have already posted about how unnecessarily verbose the lisp function is, so I won't rehash that part here.
> I got to see first-hand that notions like function calling, sequential processing, and boolean logic are not intuitive
It was a very long time since I first learned programming.
It was QBasic first I think.
It was a very slow process that I had mostly forgotten.
It took me a very long time to grasp what's now very basic control flow.
But I was like 7 or 10.
I remember feeling like "this is pure magic!" so many times so very early on.
Part of what I want to do is rekindle that like pico8 did for me. Somehow.
I honestly expected the article to start like
Smalltalk, Haskell, and Lisp
...walk into a bar.
Smalltalk: “Bartender, please send an instance of Martini to me - selector withOlive: set to true.”
Haskell: “I’ll take beer >>= chill, but only if it’s defined for all inputs and returns IO ().”
Lisp: “(drink '(ale (temperature cold))).”
The bartender mutters, “Great ... an object, a functor, and a list.”
Then a janitor walks by sweeping the closures that they left.
Then a FORTH person walks into the same bar...
"cold temperature ale drink"
to which the bartender replies "go you here". The janitor smiles as no sweeping required on a clean stack.
A C guy asks for a beer and gets a beer, the second line of the liquor shelf and the contents of the garbage can before they are thrown out of the bar.
Alternatively:
to which the bartender replies "ok"
2011
The epiphany I had with Haskell was understanding why it was so modular. It goes beyond just understanding Haskell, though, and toward understanding the fundamental nature of modularity.
Think about it. What is the smallest most primitive computation that is modular?
If you have a method that mutates things, then that method is tied to the thing it mutates and everything else that mutates it. You can't move a setter away from the entity it mutates without moving the entity with it. It's not modular.
What about an IO function? Well for an IO function to work it must be tied to the IO. Is it a SQL database? A chat client? If you want to move the IO function you have to move the IO it is intrinsically tied with.
So what is the fundamental unit of computation? The pure function. A function that takes an input and returns an output deterministically. You can move that thing anywhere and it is also the smallest computation possible in a computer.
The reason why Haskell is so modular is that it forces you to segregate IO from purity. The type system automatically forces you to build the code from the most modular primitives possible. It's almost like how Rust forces you to code in a way that prevents certain errors.
Your code in Haskell will be made entirely of "pure" functions, with things touching IO segregated away via the IO monad. There is no greater form of modularity and decoupling.
In fact, any other computing primitive you use in your program will inevitably be LESS modular than a pure function. That's why, when you're programming in OOP or imperative or any other style, you'll always hit more points where modularity breaks than you would in, say, Haskell. That is not to say perfect modularity is the end goal, but if it were, the path there is functional programming.
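To make the contrast concrete in Lisp terms (a hedged sketch; the point itself is language-agnostic):

    ;; Pure: depends only on its arguments, so it can move anywhere.
    (defun weighted-mean (values weights)
      (/ (reduce #'+ (mapcar #'* values weights))
         (reduce #'+ weights)))

    ;; Not modular: tied to the global it mutates and to every other
    ;; piece of code that reads or writes *scan-log*.
    (defvar *scan-log* '())
    (defun record-scan! (entry)
      (push entry *scan-log*))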
Ugh, now I want to pick up F# again. I like and identify with many points about Haskell but F# fits like a glove for our ecosystem here.
The problem with using objects and structs in Common Lisp is the verbosity of access, or of setting up shorter access using with-slots and with-accessors.
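For context, the standard-CL shape being complained about looks roughly like this (accessor names assumed):

    (defun time-to-move (from-pos to-pos)
      (with-accessors ((from-az azimuth) (from-el elevation)) from-pos
        (with-accessors ((to-az azimuth) (to-el elevation)) to-pos
          (max (time-to-rotate from-az to-az)
               (time-to-ascend from-el to-el)))))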
I fixed that in TXR Lisp:
    (defun time-to-move (from-pos to-pos)
      (max (time-to-rotate from-pos.azimuth to-pos.azimuth)
           (time-to-ascend from-pos.elevation to-pos.elevation)))
The remaining issue is self-inflicted verbose naming. I would make it:
    ;; time to move
    (defun ttm (p0 p1)
      (max (ttr p0.az p1.az)
           (tta p0.el p1.el)))
If the entire program fits on a few pages, you can remember a few abbreviations.
The dot notation is syntactic sugar for certain S-exps:
    a.b.(c x y).d  ->  (qref a b (c x y) d)
    .a.b.c         ->  (uref a b c)
There exist macros by these names which take care of the rest.
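So the from-pos.azimuth in the earlier definition simply reads as:

    (qref from-pos azimuth)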
Starting in TXR 300, there can be whitespace after the dot. So while you cannot write it in the popular way:
    obj
    .(meth arg)
    .(next-meth 42)
you can do it like this:
    obj.
    (meth arg).
    (next-meth 42)
This has a lot to do with Lisp already having a consing dot.
Lisp doesn't mean being against all syntactic sugar; e.g. we mostly write 'X rather than (quote X).
But note, even if we don't have this notation, how the author has tripped over himself to create verbosity. After two nestings of with-accessors, he ends up with:

    (time-to-rotate from-az to-az) (time-to-ascend from-el to-el)
Here from-az is not a whole lot shorter than (az from)! If he used shorter names like el and az, he would have:
    (defun time-to-move (p0 p1)
      (max (time-to-rotate (az p0) (az p1))
           (time-to-ascend (el p0) (el p1))))
Don't make names for yourself that you then have to shorten with clumsy macros that make the function longer overall.
Another thing: why does time-to-move take two objects, while time-to-rotate and time-to-ascend work with destructured azimuths and elevations?
    (defun time-to-move (from to)
      (max (time-to-rotate from to)
           (time-to-ascend from to)))
Time to rotate and time to ascend could be inseparable. If the device is mounted on an incline, the time to ascend may depend on the azimuth. It may be better to rotate first and then elevate, or vice versa, or to execute some optimal combined motion.
The moment of inertia of the thing may depend on its elevation; it may be easier to rotate at a high elevation. So even time-to-rotate alone needs the whole object.
Smalltalk used to have a left arrow for assignment, like the one I see in the Haskell here.
It's a lot prettier than :=
Smalltalk's left arrow is actually _. Before ASCII was fully standardized to what we know today, ^ was ↑ and _ was ←. This is visible in some later character sets, like PETSCII and the fonts used for Smalltalk. In Smalltalk, you can still type 'a _ foo' and it will assign foo to a.
In Cuis Smalltalk, you can still type
b _ 5
and see
b ← 5
and copy / paste here and see
b := 5
https://cuis.st/
IDK. They all look a little atrocious to me.
But readability has a lot to do with what you are used to.
The only exception might be FORTH. A very well written FORTH implementation (and I mean very well written) probably would be fairly readable to anyone — at least at the higher levels of abstraction.
And Forth was invented by Charles Moore while at NRAO!
https://en.wikipedia.org/wiki/Charles_H._Moore?wprov=sfti1#E...
"In 1968, while employed at the United States National Radio Astronomy Observatory (NRAO), Moore invented the initial version of the Forth language to help control radio telescopes."
W. Richard Stevens wrote a Forth manual for Kitt Peak in the 70s. Now I’m curious how many observatories used the language.
I'll bite. Are you able to share any FORTH code/repos that hit that aesthetic spot for you?
Forth user here (Eforth Subleq/PFE). Scheme is not that bad. Haskell, OTOH, and any ML language, look very difficult to me.
That FORTH cannot muster the decency to include a simple «U» in its name rather scuppers any hope of an aesthetic parley before the tea’s even been poured.
Joking aside, FORTH’s reliance on the stack as virtually its only data structure – along with its insistence on applying it to everything and everyone – is simply too impractical outside a few niche areas, such as low-level hardware programming or document rendering (hello, PostScript!). I have no doubt a JSON parser in FORTH will emerge as part of Advent of Code 2038, but I can’t imagine it will inspire hesitant potential converts to embrace the language with open arms.