Lists and the cons operator (:) are very common in Haskell. Cons is our friend. But sometimes I want to add to the end of a list instead.
xs `append` x = xs ++ [x]
This, sadly, is not an efficient way to implement it.
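To make the cost concrete, here is a sketch of my own (the helper names buildBySnoc and buildByCons are made up for illustration): building an n-element list by repeated ++ [x] is O(n^2) once the result is fully evaluated, because each append copies the whole accumulated prefix, whereas consing and reversing once at the end is O(n).

-- O(n^2): every step re-copies the accumulated list acc
buildBySnoc :: Int -> [Int]
buildBySnoc n = foldl (\acc x -> acc ++ [x]) [] [1..n]

-- O(n): build backwards with (:), then reverse once
buildByCons :: Int -> [Int]
buildByCons n = reverse (foldl (flip (:)) [] [1..n])

Both produce [1..n]; only the cost differs.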
I wrote up Pascal's triangle in Haskell, but I had to use the ++ [x] anti-idiom:
ptri = [1] : mkptri ptri
mkptri (row:rows) = newRow : mkptri rows
  where newRow = zipWith (+) row (0:row) ++ [1]
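For reference, forcing the first few rows in GHCi gives the expected triangle:

take 5 ptri
-- [[1],[1,1],[1,2,1],[1,3,3,1],[1,4,6,4,1]]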
imho, this is a lovely readable Pascal's triangle and all, but the anti-idiom irks me. Can someone explain to me (and, ideally, point me to a good tutorial on) what the idiomatic data structure is for cases where you want to append to the end efficiently? I'm hoping for near-list-like beauty in this data structure and its methods. Or, alternately, explain to me why this anti-idiom is actually not that bad for this case (if you believe such to be the case).
[edit] The answer I like the best is Data.Sequence, which does indeed have "near-list-like beauty." Not sure how I feel about the required strictness of operations. Further suggestions and different ideas are always welcome.
import Data.Sequence ((|>), (<|), zipWith, singleton)
import Prelude hiding (zipWith)
ptri = singleton 1 : mkptri ptri
mkptri (seq:seqs) = newRow : mkptri seqs
  where newRow = zipWith (+) seq (0 <| seq) |> 1
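As a quick sanity check of my own (toList comes from Data.Foldable and isn't imported above), the Sequence version yields the same rows in GHCi:

import Data.Foldable (toList)
map toList (take 4 ptri)
-- [[1],[1,1],[1,2,1],[1,3,3,1]]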
Now we just need List to be a class, so that other structures can use its methods like zipWith without hiding it from Prelude, or qualifying it. :P
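For what it's worth, here is a minimal sketch of what such a class might look like. It is purely hypothetical: the class name ListLike and its methods are my own invention (packages like ListLike and mono-traversable explore this idea for real), and it exists only to show that the triangle can be written once against an interface and then instantiated at both [] and Seq.

import Data.Sequence (Seq, (<|), (|>))
import qualified Data.Sequence as Seq

class ListLike f where
  single   :: a -> f a
  cons     :: a -> f a -> f a
  snoc     :: f a -> a -> f a
  zipWith' :: (a -> b -> c) -> f a -> f b -> f c

instance ListLike [] where
  single x  = [x]
  cons      = (:)
  snoc xs x = xs ++ [x]    -- still O(n) for plain lists
  zipWith'  = zipWith

instance ListLike Seq where
  single   = Seq.singleton
  cons     = (<|)
  snoc     = (|>)          -- O(1) at the right end of a Seq
  zipWith' = Seq.zipWith

-- Pascal's triangle written once against the class
ptriG :: ListLike f => [f Integer]
ptriG = single 1 : go ptriG
  where go (row:rows) = snoc (zipWith' (+) row (cons 0 row)) 1 : go rows

For example, take 4 (ptriG :: [[Integer]]) and take 4 (ptriG :: [Seq Integer]) give the same rows.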
Keep in mind that what looks like poor asymptotics might actually not be, because you are working in a lazy language. In a strict language, appending to the end of a linked list in this way would always be O(n). In a lazy language, it's O(n) only if you actually traverse to the end of the list, in which case you would have spent O(n) effort anyway. So in many cases, laziness saves you.
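To illustrate that point with a toy example of my own: appending to the end of a long list costs nothing if you only ever look at the front, because the thunk created by ++ is never forced past what you consume.

head ([1..1000000] ++ [0])   -- O(1): the append is never actually performed
last ([1..1000000] ++ [0])   -- O(n): forcing the whole list pays for the append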
This isn't a guarantee... for example, k appends followed by a traversal will still run in O(nk) when it could have been O(n+k). But it does change the picture somewhat. Thinking about the performance of single operations in terms of their asymptotic complexity when the result is immediately forced doesn't always give you the right answer in the end.
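If you really do need k end-appends followed by a single traversal in O(n+k), one well-known workaround (my own sketch, not part of the answer above) is a difference list: represent the list as a function that prepends it, so that appending becomes function composition. The names below are made up for illustration; the dlist package on Hackage provides a polished version of the same idea.

type DList a = [a] -> [a]

fromListD :: [a] -> DList a
fromListD xs = (xs ++)

snocD :: DList a -> a -> DList a
snocD f x = f . (x:)    -- O(1) append

toListD :: DList a -> [a]
toListD f = f []        -- a single O(n+k) pass at the end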