 19 May, 2014 7 commits


Mike Bostock authored

Mike Bostock authored

Jason Davies authored
Postorder traversal alone causes all parent values to be reset to zero *after* accumulating child values. An additional preorder traversal resets all parent values to zero first, and the postorder traversal can then accumulate child values in their parents. This only affects sticky treemaps.
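The two-pass fix can be sketched like this; the node shape and function names are hypothetical, not d3's treemap internals:

```javascript
// A sketch of the two-pass accumulation described above, assuming a
// hypothetical node shape {value, children}.
function accumulate(root) {
  // Preorder pass: zero every parent value first.
  (function reset(node) {
    if (node.children) {
      node.value = 0;
      node.children.forEach(reset);
    }
  })(root);
  // Postorder pass: accumulate child values into their parents.
  (function sum(node) {
    if (node.children) {
      node.children.forEach(sum);
      node.value = node.children.reduce(function(v, c) {
        return v + c.value;
      }, 0);
    }
  })(root);
  return root;
}
```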

Mike Bostock authored

Mike Bostock authored

Mike Bostock authored

ljani authored

 18 May, 2014 5 commits


Mike Bostock authored
removing .js extension from the 'main' property

Mike Bostock authored

Mike Bostock authored

Mike Bostock authored

Mike Bostock authored

 17 May, 2014 1 commit


Mike Bostock authored
Rather than creating a temporary _tree hash on the tree nodes to store temporary variables needed to compute the tree layout, the tree is wrapped. This eliminates the risk of a namespace collision, and eliminates the need to subsequently delete temporary variables. (They will be garbage collected.)
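A minimal sketch of the wrapping approach, with hypothetical field names (the real layout tracks more state than this):

```javascript
// Instead of stashing layout state on the user's nodes (e.g. node._tree),
// build a parallel tree of wrapper objects that reference the originals.
// The wrappers hold the temporary layout variables and are simply
// garbage-collected after layout, so nothing has to be deleted.
function wrapTree(root) {
  function wrap(node, parent) {
    var wrapper = {node: node, parent: parent, children: []};
    (node.children || []).forEach(function(child) {
      wrapper.children.push(wrap(child, wrapper));
    });
    return wrapper;
  }
  return wrap(root, null);
}
```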

 13 May, 2014 1 commit


Erin Jane authored

 22 Apr, 2014 1 commit


Mike Bostock authored

 13 Apr, 2014 4 commits


Mike Bostock authored

Mike Bostock authored

Mike Bostock authored

Mike Bostock authored

 11 Apr, 2014 1 commit


Jason Davies authored
Originally we were using Welford’s algorithm, but this is primarily useful when computing the variance in a numerically stable manner, since Welford’s approach requires an incremental mean. I’ve removed a test for the mean of more than one instance of Number.MAX_VALUE as this is unlikely to occur in practice; most likely this was the reason I used Welford’s algorithm in the first place. There’s a paper [1] comparing various algorithms for computing the mean, and Welford’s is actually slightly less accurate than the naïve approach. There are some more accurate approaches but I think it’s overkill for d3.mean. [1] Youngs, Edward A., and Elliot M. Cramer. "Some results relevant to choice of sum and sum-of-product algorithms." Technometrics 13.3 (1971): 657-665. Related: #1842.
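For contrast, here are the two approaches in miniature; these are illustrative sketches, not d3's source:

```javascript
// Naïve mean: a single running sum, divided once at the end.
function naiveMean(values) {
  var sum = 0;
  for (var i = 0; i < values.length; i++) sum += values[i];
  return sum / values.length;
}

// Welford-style incremental mean: updates the mean on every value,
// which matters for a stable variance but buys little for the mean alone.
function incrementalMean(values) {
  var mean = 0;
  for (var i = 0; i < values.length; i++) mean += (values[i] - mean) / (i + 1);
  return mean;
}
```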

 10 Apr, 2014 2 commits


Mike Bostock authored

Jason Davies authored

 08 Apr, 2014 3 commits


Mike Bostock authored

Mike Bostock authored

Mike Bostock authored

 06 Apr, 2014 4 commits


Mike Bostock authored

Mike Bostock authored

Mike Bostock authored
If there are a lot of matching numbers, it’s faster to do direct string equality comparisons than it is to coerce to a number and compare numerically.
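The shape of the optimization, as a hypothetical helper (not the commit's actual code): string equality short-circuits the common case, and numeric coercion only runs when the strings differ:

```javascript
function numbersEqual(a, b) {
  // Cheap path: identical string representations compare directly.
  // Fallback: coerce both to numbers and compare numerically.
  return a === b || +a === +b;
}
```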

Mike Bostock authored

 04 Apr, 2014 1 commit


Jason Davies authored
Fixes #1823; spurious closePath events were being generated for degenerate polygons due to generation of empty polygons and rings in rare cases.

 24 Mar, 2014 10 commits


Mike Bostock authored

Mike Bostock authored
I’m not entirely sure this is the most useful behavior, but since typeof null is "object" and +null is 0, interpolating to null is equivalent to interpolating to the number zero.
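The equivalence follows directly from coercion; a minimal numeric interpolator (close in spirit to d3.interpolateNumber, though simplified here) makes it visible:

```javascript
function interpolateNumber(a, b) {
  a = +a; // +null is 0, so a null endpoint behaves like the number zero
  b = +b;
  return function(t) { return a * (1 - t) + b * t; };
}
```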

Mike Bostock authored

Mike Bostock authored
Fixes #1748.

Mike Bostock authored
Fixes #1773.

Mike Bostock authored
The point of this method is to pick the right precision for you!

Mike Bostock authored
Rather than overload the meaning of precision to bias the selection of the SI prefix, always use the standard SI prefix, and use the precision in the same sense as with fixed digits: the number of digits after the decimal point.

Mike Bostock authored
For reasons that I can’t recall, the SI-prefix behavior was different for small numbers (between -1 and 1) than it was for large numbers. This commit enforces consistent behavior, so that the coefficient is always in the range [1, 1000), like in engineering notation. For example, the old d3.format("s") would display 0.01 as "0.01", whereas the new behavior displays it as "10m".
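The consistent rule can be sketched as follows; this is a simplified illustration with a truncated prefix table, not d3's formatter:

```javascript
var prefixes = ["p", "n", "µ", "m", "", "k", "M", "G", "T"];

function siFormat(x) {
  if (x === 0) return "0";
  // Choose the power of 1000 that puts the coefficient in [1, 1000),
  // for small and large magnitudes alike.
  var e = Math.max(-4, Math.min(4,
      Math.floor(Math.log(Math.abs(x)) / Math.log(1000))));
  return (x / Math.pow(1000, e)) + prefixes[e + 4];
}
```

Under this rule a value like 0.01 picks up the "m" prefix instead of being left unprefixed.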

Mike Bostock authored
When an SI-prefix format (type "s") is passed to scale.tickFormat, compute a suitable SI prefix based on the maximum value in the range, and then use that prefix for all ticks rather than computing the SI prefix on a per-tick basis.
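A sketch of the idea as a hypothetical helper (scale.tickFormat's real signature and behavior differ):

```javascript
// Derive one SI prefix from the largest tick value, then reuse it for
// every tick, so a single axis never mixes e.g. "500k" with "1M".
function fixedPrefixTickFormat(maxTick) {
  var prefixes = ["", "k", "M", "G"];
  var e = Math.max(0, Math.min(3,
      Math.floor(Math.log(Math.abs(maxTick)) / Math.log(1000))));
  var k = Math.pow(1000, e);
  return function(tick) { return (tick / k) + prefixes[e]; };
}
```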

Mike Bostock authored
Fixes #1717. Turns out, -1 % 1 is -0!
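The quirk is easy to demonstrate: in JavaScript the remainder operator takes the sign of the dividend, so -1 % 1 yields negative zero:

```javascript
var r = -1 % 1;
// -0 compares equal to 0 with ===, so detect it via division:
// 1 / -0 is -Infinity, while 1 / 0 is Infinity.
var isNegativeZero = (r === 0 && 1 / r === -Infinity);
```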
