Cale 2017-03-09 07:52:54
sdrodge: When you use IM.insertWith (+) there, it won't actually calculate the sum; it just sticks the unevaluated expression into the map
Sonolin 2017-03-09 07:52:59
I always just use the lazy versions, and when I run into problems or issues I then look at the strict variants
Sonolin 2017-03-09 07:53:10
profiling with ghc is a breeze IMO
sdrodge 2017-03-09 07:53:31
Is that the general practice, then?
sdrodge 2017-03-09 07:53:38
Write lazy and then profile it and fix it?
merijn 2017-03-09 07:54:02
sdrodge: Well, if you get better at understanding the implications of laziness you can predict them without profiling
Cale 2017-03-09 07:54:09
So if you don't observe the values in the map, you'll end up with large expressions of the form ((...((1 + 1) + 1) + ...) + 1) + 1
sdrodge 2017-03-09 07:54:12
No real guidelines to follow for inserting strictness annotations that are "obviously" right the first time you write it?
merijn 2017-03-09 07:54:40
sdrodge: Well, there are heuristics and guidelines, but they are rather tricky to articulate
sdrodge 2017-03-09 07:54:46
Cale: Is that a real cost given that I'm storing all the past versions of the map anyway?
merijn 2017-03-09 07:54:53
At least, I realise I have them, but don't know how to articulate them
sdrodge 2017-03-09 07:55:01
merijn: Know of any good write ups?
Cale 2017-03-09 07:55:07
Yes, an expression like that requires much more memory than an Int would otherwise take
monochrom 2017-03-09 07:55:21
The real guideline is to learn laziness and decide, very consciously, whether you want it or not.
Cale 2017-03-09 07:55:25
and if you wait too long before eventually evaluating it, you'll end up with a stack overflow
Cale 2017-03-09 07:55:42
because (+) is strict in both its arguments
merijn 2017-03-09 07:55:46
sdrodge: Starting point for understanding laziness would be: https://hackhands.com/guide-lazy-evaluation-haskell/
Cale 2017-03-09 07:55:52
So you'll start out with (...) + 1
merijn 2017-03-09 07:56:18
sdrodge: A more in-depth follow up (assumes you know a little bit about C/asm) would be the STG paper: http://citeseerx.ist.psu.edu/viewdoc/download?doi=10.1.1.53.3729&rep=rep1&type=pdf
Cale 2017-03-09 07:56:25
and the outermost (+) will say, "Oh, I need to match on my first argument", and that pattern match will wait on the stack while the first argument is evaluated
Cale 2017-03-09 07:56:32
and it too will be of the form (...) + 1
Cale 2017-03-09 07:56:48
and if you have like a million of those, you'll blow up your stack
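A minimal sketch of the build-up being described here, assuming the containers package's Data.IntMap.Lazy and Data.IntMap.Strict (the names lazyCount and strictCount are made up for illustration):

    import Data.List (foldl')
    import qualified Data.IntMap.Lazy as IML
    import qualified Data.IntMap.Strict as IMS

    -- The lazy IntMap stores a growing chain of unevaluated (+) applications;
    -- the single lookup at the end has to evaluate the whole chain at once.
    lazyCount :: Int -> Int
    lazyCount n =
      IML.findWithDefault 0 0 (foldl' (\m _ -> IML.insertWith (+) 0 1 m) IML.empty [1 .. n])

    -- The strict IntMap evaluates the combined value at insert time, so the
    -- map only ever holds a plain Int.
    strictCount :: Int -> Int
    strictCount n =
      IMS.findWithDefault 0 0 (foldl' (\m _ -> IMS.insertWith (+) 0 1 m) IMS.empty [1 .. n])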
sdrodge 2017-03-09 07:58:02
merijn: My problem is that I already know how Haskell does laziness, but I find it hard to convert that knowledge into good strictness annotations while writing code.
Cale 2017-03-09 07:58:34
My usual rule of thumb is to break things down by whether operations are taking or producing things with many or few separately evaluatable parts
sdrodge 2017-03-09 07:58:49
Cale: I see why that would be true if I were only keeping the current map around, but aren't I paying that memory penalty anyway by keeping all the maps around?
Cale 2017-03-09 07:59:04
It's the cases where something is taking many separate parts, and condensing them down into something with few parts that you want strictness.
monochrom 2017-03-09 07:59:08
That's probably just because your own strictness annotations are not enough. The libraries you use have a say too; that's why Data.Map.Strict exists. It's not something you can do from outside the library.
merijn 2017-03-09 07:59:19
sdrodge: Why do you think you're keeping all the maps around?
Cale 2017-03-09 07:59:36
sdrodge: No, this is a separate penalty
sdrodge 2017-03-09 07:59:55
merijn: Because I'm keeping all the old maps in a sequence as state.
Cale 2017-03-09 08:00:00
This isn't the *Map*, this is the *Int* inside.
monochrom 2017-03-09 08:00:34
Yes, you have "one single Int" that takes up 100MB because it's still in a "1+1+1+..." form.
Cale 2017-03-09 08:00:37
You're storing a big complicated expression for that Int which, once you evaluate it, will be replaced by a machine word or two.
sdrodge 2017-03-09 08:01:52
Okay, yes, good point.
Cale 2017-03-09 08:02:30
sdrodge: Doing that sooner rather than later also makes the garbage collector's life easier, and you'll save a bunch of time on allocations.
sdrodge 2017-03-09 08:02:46
So is the generalization here that I should use insertWith' unless I specifically need the value inserted to be calculated lazily?
SLi 2017-03-09 08:03:17
I've been pondering way too long about how to best implement this (as always seems to happen when I program in Haskell :(). Essentially, I'm writing a tool that works on expressions (and-inverter graphs, i.e. each node is a binary AND operator, and the two inputs to each gate may independently be inverted or not) and applies local rewrites. But there's a wrinkle: expressed as trees, the expressions are huge and share a lot of subexpressions, so the
SLi 2017-03-09 08:04:03
input is provided as a DAG, and should be kept as a DAG. So I need sharing. A lot of the time I would like to essentially rewrite the DAG node, i.e. the expression everywhere in the tree, but other times I might want to essentially unshare it and rewrite it in only parts of the tree.
monochrom 2017-03-09 08:04:03
Why are you in such a hurry for generalizations and no-brainer rules of thumb? But yes.
Cale 2017-03-09 08:04:03
Usually... it depends of course, but yeah.
SLi 2017-03-09 08:04:03
So, I guess I could implement this as described, but I wonder if there's some abstraction that could help me, like abstracting the "sharing container with the ability to unshare" idea somehow.
Cale 2017-03-09 08:04:32
Sometimes it'll be profitable to store a lazy expression for something in the Map -- if computing the outcome will be expensive and you might not need the result, or if you expect the space for the result to be larger than the space for the expression.
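A hypothetical illustration of that trade-off, using Data.Map.Lazy (factorialTable is a made-up name): the values are expensive to compute, and any key that is never looked up is never paid for.

    import qualified Data.Map.Lazy as M

    -- Each value stays an unevaluated expression until (and unless) its key
    -- is actually demanded.
    factorialTable :: M.Map Int Integer
    factorialTable = M.fromList [ (k, product [1 .. toInteger k]) | k <- [1 .. 10000] ]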
sdrodge 2017-03-09 08:04:37
monochrom: Because it's really helpful to have guidelines/rules to follow when programming, so as not to overload working memory.
monochrom 2017-03-09 08:05:02
(Fortunately this is still not a no-brainer because "do I need laziness" still requires conscious rational reasoning.)
sdrodge 2017-03-09 08:05:06
If I have to spend time thinking carefully about whether I want the strict or the lazy version every single time I write a line of code, that's not good.
Cale 2017-03-09 08:05:23
Well, you kind of do.
sdrodge 2017-03-09 08:05:33
Sure, but it's a matter of how hard I need to think.
Cale 2017-03-09 08:05:38
It's just that 95% of the time, the answer is "it doesn't matter"
monochrom 2017-03-09 08:05:48
It has been pretty easy for me.
Cale 2017-03-09 08:05:49
and then the other 5% of the time, it's one way or the other and very important
sdrodge 2017-03-09 08:06:23
Oh, I see.
sdrodge 2017-03-09 08:06:24
So it's more that the cost of computing this is low enough that strictness is obviously right.
sdrodge 2017-03-09 08:06:48
Even though the value may not be needed.
SLi 2017-03-09 08:07:42
I've also looked at graph-rewriting packages on Hoogle. Of those, I think this one looks the most promising, at least for testing, but I think it lacks the sharing part: https://hackage.haskell.org/package/graph-rewriting
Cale 2017-03-09 08:08:16
sdrodge: yeah, in this case, you're not saving anything by deferring that addition until later
monochrom 2017-03-09 08:08:16
(Fortunately it is still not a no-brainer because for every line you still have to consciously rationally reason whether that line is in the 95% or in the 5%. Resistance is futile.)
sdrodge 2017-03-09 08:08:16
I'm really not looking for a no-brainer.
Cale 2017-03-09 08:08:16
monochrom: Yeah
sdrodge 2017-03-09 08:08:16
I'm just looking for a better feel about how to frame the question so that I can figure it out faster in the future.
Cale 2017-03-09 08:08:16
(though I'm not sure it's actually fortunate)
SLi 2017-03-09 08:08:16
I agree; I've also had problems with laziness everywhere causing hard-to-debug memory issues. Trying strict-by-default Haskell is on my list of things to do.
monochrom 2017-03-09 08:08:17
That is called experience.
sdrodge 2017-03-09 08:08:17
Instead of the situation I currently have where I write a program and then wonder, huh, did I miss any obvious strictness annotations?
sdrodge 2017-03-09 08:08:17
Okay, but it's even better if you can learn from others' experience instead of your own.
SLi 2017-03-09 08:08:32
Actually, almost every time I write a Haskell program that takes a moderately large input and outputs something simple, I have a problem where the memory usage grows to gigabytes. I've been told religiously avoiding lazy IO helps.
monochrom 2017-03-09 08:08:39
Yes. You have a USB port in your head? I could download mine to you.
sdrodge 2017-03-09 08:08:57
So one very useful tip I seem to have learned today is that in general for inexpensive operations on small data (like addition on ints), you want to demand strictness.
merijn 2017-03-09 08:08:59
SLi: Well, lazy IO is rather rarer than people seem to think it is
merijn 2017-03-09 08:09:09
Like, there are only 5 or so lazy IO operations in base
monochrom 2017-03-09 08:09:17
Or you can send me 100 bitcoins to commission me to write a whole book on this.
SLi 2017-03-09 08:09:17
merijn: Hmm, ok.
merijn 2017-03-09 08:09:24
sdrodge: Right, that's a reasonable heuristic
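A sketch of that heuristic in its simplest form (sumInts is a made-up example): the accumulator is a cheap, small Int, so forcing it at every step costs almost nothing and stops a chain of (+) thunks from piling up.

    {-# LANGUAGE BangPatterns #-}

    sumInts :: [Int] -> Int
    sumInts = go 0
      where
        -- The bang forces the accumulator on every recursive call.
        go !acc []       = acc
        go !acc (x : xs) = go (acc + x) xs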
sdrodge 2017-03-09 08:09:32
monochrom: You could see if there's a way to gel your experience into a great article with worked examples!
merijn 2017-03-09 08:09:45
sdrodge: He already has that for most topics :p
monochrom 2017-03-09 08:09:49
Yes. You can send me 100 bitcoins to commission me to write it.
SexHendrix 2017-03-09 08:09:50
just written what could be the world's slowest numerical integrator
SexHendrix 2017-03-09 08:09:54
http://lpaste.net/353356
SexHendrix 2017-03-09 08:10:13
looking for some obvious areas for speeding it up
sdrodge 2017-03-09 08:10:27
monochrom: Look, I just asked if there was such a write-up somewhere. You're the one attacking me for asking the question and suggesting this knowledge is impossible to transfer other than through hard-won experience.
sdrodge 2017-03-09 08:10:34
Cool your jets.
monochrom 2017-03-09 08:10:45
I just asked for money.
sdrodge 2017-03-09 08:10:49
:P
monochrom 2017-03-09 08:10:52
You need to take a walk.
monochrom 2017-03-09 08:11:31
And I need a haircut.
SLi 2017-03-09 08:11:38
I need both.
monochrom 2017-03-09 08:13:05
But what merijn said. http://www.vex.net/~trebla/haskell/lazy.xhtml
monochrom 2017-03-09 08:13:40
The foldr and foldl examples are not too different from the insertWith situation.
sdrodge 2017-03-09 08:13:40
Yeah. I mean, I understand the foldr and foldl examples. I just find it difficult to apply that knowledge when writing larger programs.
sdrodge 2017-03-09 08:14:14
When you guys write Haskell, do you actually stop and think about whether strictness or laziness is correct for each expression that you write?
monochrom 2017-03-09 08:16:57
Yes I do.
monochrom 2017-03-09 08:17:05
And no, not taxing.
Cale 2017-03-09 08:17:05
sdrodge: The general rule of thumb that I use is that whenever you're taking many separately evaluatable things, and combining them together into a single indivisible value (which can't be partially evaluated), that almost always describes the places where you want the strictness annotations.
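A small, made-up pair of functions illustrating that rule of thumb: producing a structure with many separately evaluatable parts is where laziness pays off, while condensing many parts into one indivisible Int is where the strictness (here via foldl') belongs.

    import Data.List (foldl')

    -- Produces many parts lazily; the consumer can stop early.
    firstSquares :: Int -> [Int]
    firstSquares n = take n (map (^ 2) [1 ..])

    -- Condenses many parts into a single Int; force as you go.
    sumOfSquares :: Int -> Int
    sumOfSquares n = foldl' (+) 0 (map (^ 2) [1 .. n])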
sdrodge 2017-03-09 08:17:05
Cale: Interesting.
sdrodge 2017-03-09 08:17:05
Thanks!
merijn 2017-03-09 08:17:05
sdrodge: Well, I just kinda have a feel for things, like Cale describes, and that sets off my "I need to pause and think" warning reflex
kuribas 2017-03-09 08:17:05
sdrodge: for small numerical functions, laziness can be a high cost. I had a 10X performance increase by forcing a tuple, because keeping the closure and garbage collecting it was much more expensive.
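A hypothetical sketch of forcing a tuple like that (not the actual code being described): both components come from the same traversal and are forced at every step, so they never accumulate more than one layer of thunk, instead of becoming closures over the whole input.

    {-# LANGUAGE BangPatterns #-}

    import Data.List (foldl')

    sumAndLength :: [Double] -> (Double, Int)
    sumAndLength = foldl' step (0, 0)
      where
        -- The bangs force both components of the incoming pair, so no long
        -- chain of thunks can build up behind it.
        step (!s, !n) x = (s + x, n + 1)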
sdrodge 2017-03-09 08:17:05
I guess maybe the misconception that I have is that it doesn't always seem possible to reason about strictness locally.
mniip 2017-03-09 08:17:05
sdrodge, depends on what mode I'm programming in
sdrodge 2017-03-09 08:17:05
But maybe I'm missing something.
merijn 2017-03-09 08:17:05
kuribas: Well, forcing it might also help the strictness analyser
Cale 2017-03-09 08:17:05
sdrodge: Well, it's not.
mniip 2017-03-09 08:17:05
sometimes I'm writing code so that it runs well, sometimes I only need a qualitative result
Cale 2017-03-09 08:17:05
That's not a misconception -- how expressions are evaluated is something that can require insight into what the whole program is doing in general.
kuribas 2017-03-09 08:17:05
merijn: it wasn't strict, because both values depended on the same computations.
sdrodge 2017-03-09 08:17:05
That's the part that bothers me.
sdrodge 2017-03-09 08:17:05
Because whole program reasoning doesn't scale.
merijn 2017-03-09 08:17:09
kuribas: No, I meant that adding the strictness might lead the strictness analyser to kick in and unbox things
merijn 2017-03-09 08:17:21
sdrodge: That's a downside yes, but there's also an upside!
kuribas 2017-03-09 08:17:21
merijn: right
sdrodge 2017-03-09 08:17:31
Don't worry, I'm not upset about laziness.
sdrodge 2017-03-09 08:17:33
I love it.
merijn 2017-03-09 08:17:40
sdrodge: If you have code that is "too strict", making it lazier requires changing ALL the code
SLi 2017-03-09 08:17:42
So, I wonder if it would be possible to write some... thing that takes any type (Hashable a, Eq a, Functor f) => f a and produces an isomorphic type with observable sharing of a. Though that still wouldn't work for sharing subtrees in trees. Hmh.
sdrodge 2017-03-09 08:17:46
I just want to learn how to manage the lazy/strict choice better.
sdrodge 2017-03-09 08:21:14
Like Control.Monad.State.{Lazy, Strict}
sdrodge 2017-03-09 08:21:34
Do you tend to default to Lazy or Strict?
sdrodge 2017-03-09 08:21:41
Is the answer the same or different for ST?
Cale 2017-03-09 08:21:42
For Control.Monad.State I just import Control.Monad.State
sdrodge 2017-03-09 08:21:51
So that's the Lazy one.
Cale 2017-03-09 08:21:55
which I believe gets me the Lazy one, yeah
Cale 2017-03-09 08:22:37
That doesn't matter too much in practice -- it's referring to whether the pairs are being forced in the definition of bind
Cale 2017-03-09 08:23:28
i.e. whether it's x >>= f = State (\s -> let (v,s') = runState x s in runState (f v) s')
Cale 2017-03-09 08:23:51
or x >>= f = State (\s -> case runState x s of (v,s') -> runState (f v) s')
monochrom 2017-03-09 08:25:41
Actually they use "(v, s')" vs "~(v, s')" now.
Cale 2017-03-09 08:26:57
fair enough (should be the same thing)
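A standalone sketch that makes the two binds above compilable (the State newtype here is local, not the one from transformers):

    newtype State s a = State { runState :: s -> (a, s) }

    -- Lazy bind: the pair is bound with a lazy let, so nothing runs until
    -- v or s' is actually demanded.
    bindLazy :: State s a -> (a -> State s b) -> State s b
    bindLazy x f = State (\s -> let (v, s') = runState x s in runState (f v) s')

    -- Strict bind: the case forces the pair to WHNF as soon as the bind
    -- runs (only the pair itself, not the values inside it).
    bindStrict :: State s a -> (a -> State s b) -> State s b
    bindStrict x f = State (\s -> case runState x s of (v, s') -> runState (f v) s')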
Cale 2017-03-09 08:29:15
Note that even with Control.Monad.State.Strict, you can still have the state itself be an unevaluated expression
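A hedged sketch of that point, assuming mtl's Control.Monad.State.Strict with modify and modify' (the counter functions are made-up names):

    import Control.Monad.State.Strict (execState, modify, modify')

    -- Even in the Strict module, modify (+ 1) only forces the (result, state)
    -- pair; the counter itself becomes a chain of (+ 1) thunks.
    lazyCounter :: Int -> Int
    lazyCounter n = execState (mapM_ (\_ -> modify (+ 1)) [1 .. n]) 0

    -- modify' forces the new state to WHNF on every step, so the counter
    -- stays a plain Int.
    strictCounter :: Int -> Int
    strictCounter n = execState (mapM_ (\_ -> modify' (+ 1)) [1 .. n]) 0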
monochrom 2017-03-09 08:29:15
(and an implicit case. Why implicit? Because they obtained the "case" from do-notation. Why do-notation? Because it's StateT, even if it's StateT Identity)
Cale 2017-03-09 08:29:15
ah, right, that makes sense :)
sdrodge 2017-03-09 08:29:15
So it seems I am missing something here. What is the significance of forcing the tuple directly in the bind vs. not?
monochrom 2017-03-09 08:29:15
Yes, this one is very subtle.
monochrom 2017-03-09 08:29:15
My http://lpaste.net/41790/ shows what could happen.
Cale 2017-03-09 08:31:44
In my experience, people usually don't care and import Control.Monad.State, which makes that the sensible default regardless of its semantics
monochrom 2017-03-09 08:32:18
Ah but they have a high chance of consuming more memory than they need.
dolio 2017-03-09 08:32:57
Strict used to avoid some stack overflows from really big computations. But I guess that's gone now.