Project Euler: Problem 6

After a nice vacation, the obsession continues.

The sum of the squares of the first ten natural numbers is,
1² + 2² + ... + 10² = 385
The square of the sum of the first ten natural numbers is,
(1 + 2 + ... + 10)² = 55² = 3025
Hence the difference between the sum of the squares of the first ten natural numbers and the square of the sum is 3025 - 385 = 2640.

Find the difference between the sum of the squares of the first one hundred natural numbers and the square of the sum.


  module Main where

  main :: IO ()
  main = do
      let x = [1..100]
      let sum2 = sum $ map (^2) x
      let sum1 = (sum x) ^ 2
      print (sum1 - sum2)

  #!/usr/bin/python3
  """Project Euler solution using generators."""

  sum1 = 0
  sum2 = 0

  for i in ((x, x ** 2) for x in range(1, 100 + 1)):
      sum1 += i[0]
      sum2 += i[-1]

  print(sum1 ** 2 - sum2)

  #!/usr/bin/perl

  my $sum1 = 0;
  my $sum2 = 0;

  for (1..100)
  {
      $sum1 += $_;
      $sum2 += $_ ** 2;
  }

  print($sum1 ** 2 - $sum2);

  public class problem6
  {
      public static void main(String[] args)
      {
          int sum1 = 0;
          int sum2 = 0;

          for (int i = 0; i < 100; i++)
          {
              sum1 += i;
              sum2 += i ^ 2;
          }

          System.out.println(sum1 ^ 2 - sum2);
      }
  }

I admit writing a solution in Java is kind of a cheat. It's exactly the same as the Perl solution, minus some language differences. But since I started playing around with Android, I've been spending more time with Java, and it has just bled into this project.

Run time comparison:

Because Java has to be compiled, I'm now posting the time for the compiled version of the Haskell solution.
Haskell: 0.004s
Python: 0.285s
Perl: 0.005s
Java: 0.625s

I'll refrain from insulting the speed of Java if you do. ;)


Because of the simplicity of the solution, I played around with the answers in Haskell and Python. Originally my Python solution looked more like my Haskell one, but after learning about generators I decided to use one for the Python solution. All in all, I think it worked out rather well.

Questions, comments, insults?? (And not about me, about the code... I know what you people are thinking!)

Wrong operator in Java code

In the Java code you should change:

  n ^ 2

to either:

  (int) Math.pow(n, 2);

or:

  n * n

^ is the bitwise XOR operator, not exponentiation.
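The same pitfall exists in Python, by the way, where ^ is also bitwise XOR. A quick illustrative sketch (mine, not from the original comment):

```python
# In Python, as in Java, ^ is bitwise XOR, not exponentiation.
print(10 ^ 2)   # 1010 XOR 0010 = 1000 in binary, so this prints 8
print(10 ** 2)  # ** is exponentiation, so this prints 100
```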

Thank you xelamitchell for

Thank you, xelamitchell, for pointing that out. I will update the code in my repo to make it correct. I guess my Java is weaker than I thought... oh well.


  using System;

  namespace Problem6
  {
      class Program
      {
          static void Main()
          {
              Console.WriteLine(SquareOfSum() - SumOfSquares());
              Console.ReadLine();
          }

          static double SumOfSquares()
          {
              double sum = 0;

              for (var i = 0; i <= 100; i++)
              {
                  sum += Math.Pow(i, 2);
              }
              return sum;
          }

          static double SquareOfSum()
          {
              double sum = 0;
              for (var i = 1; i <= 100; i++)
              {
                  sum += i;
              }
              sum = Math.Pow(sum, 2);
              return sum;
          }
      }
  }


Very nice post!

Similar to "Pencil and

Similar to "Pencil and Paper":

  f n = (div (n * (n + 1)) 2) ^ 2 - div (2 * n^3 + 3 * n^2 + n) 6
  main = print $ f 100

Runs in similar time for n = 100, but it is O(1) rather than O(n). Also, when n = 100, it uses less than half the memory :)

Arguably more fluent Perl

  use feature 'say';
  use List::Util 'sum';

  my @nums = (1..100);
  say sum(@nums) ** 2 - sum(map $_ ** 2, @nums);

Andrew, That is a very

Andrew, that is a very beautiful, and short, Perl answer. Thank you for sharing it. I really am bad at remembering that Perl has those higher-order functions.

So you measured the elapsed

So you measured the elapsed time with just a time command preceding each program? If so, that's really silly, because you're not taking JVM startup time into account.

Hey Anonymous, Thank you for

Hey Anonymous,

Thank you for your comment; you have a valid point. My runtime comparisons do not account for VM startup. But I would like to add that I'm not really trying to prove anything with these runtime comparisons, and in no way should they be taken seriously. I'm merely doing them for the sake of seeing how similar algorithms behave in different languages.

Python version

Generators are the right tool for the job in Python, but you used them to do the job the same way you did in Perl and Java. You should do the job the Pythonic way, what a previous comment calls the 'right' way, and not just because it's faster.

In this context, think of generators either as replacing a loop or as replacing a map statement: you didn't use them to do either.

Replacing the potential map statement with a generator ends up with code like you wrote for Haskell and like that given in a previous comment:

  nums = range(1, 100 + 1)
  sum1 = sum(nums)
  sum2 = sum(x ** 2 for x in nums)
  print(sum1 ** 2 - sum2)

or, in one statement: print(sum(range(1, 101)) ** 2 - sum(x ** 2 for x in range(1, 101)))

Using your code but removing the loop would give (wrapping zip in list() so the result can be indexed in Python 3):

  nums = list(zip(*((x, x ** 2) for x in range(1, 100 + 1))))
  sum1 = sum(nums[0])
  sum2 = sum(nums[-1])
  print(sum1 ** 2 - sum2)

Getting rid of the itemgetting and unpacking instead:

  nums1, nums2 = zip(*((x, x ** 2) for x in range(1, 100 + 1)))
  sum1 = sum(nums1)
  sum2 = sum(nums2)
  print(sum1 ** 2 - sum2)

Getting rid of the packing would put you back at the same code as above, where the generator is just replacing a map statement; this is the Pythonic (simplest, most elegant) way to do the job.

We really should call these

We really should call these 'generator expressions', not just generators.

Hey agf, thanks for the

Hey agf,

thanks for the little tutorial on generator expressions, as well as for posting your solution to this problem. Hopefully other people will see it and learn from it as well.

(I love that I get to learn things from the comments on my blog. ;))

pencil and paper

sum(n) -> n(n+1)/2
sum(n^2) -> n(n+1)(2n+1)/6
sum(n)^2 - sum(n^2) -> n(n+1)(6n^2 -2n -4)/24
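As a sanity check, the closed form above can be verified against the brute-force sums in a few lines of Python (the function names here are mine; the formula is Gary's):

```python
# Brute force: square of the sum minus the sum of the squares for 1..n.
def brute(n):
    nums = range(1, n + 1)
    return sum(nums) ** 2 - sum(x * x for x in nums)

# Closed form from the comment above: n(n+1)(6n^2 - 2n - 4)/24.
def closed_form(n):
    return n * (n + 1) * (6 * n * n - 2 * n - 4) // 24

for n in (1, 2, 10, 100, 1000):
    assert brute(n) == closed_form(n)

print(closed_form(100))  # prints 25164150, the answer to the problem
```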

Hi Gary, I am blown away by

Hi Gary,

I am blown away by this answer! You've significantly reduced the amount of computation required to get the answer.

I've got to ask: where did you learn that sum(n) = n(n+1)/2?

One of those stories

This is one of those famous tales good math teachers like to tell: the young Gauss, asked to sum the numbers from 1 to 100, paired them into 50 pairs each totaling 101 and answered 5050 almost immediately.

Pretty sure I learnt sum(n)

Pretty sure I learnt sum(n) somewhere in maths in primary school. sum(n^2) I knew would be some cubic polynomial, so I googled for the coefficients.
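For what it's worth, the coefficient hunt can also be automated: since sum(n^2) is known to be a cubic in n, fitting a degree-3 polynomial through a handful of exact points recovers it. A small sketch using NumPy (my illustration, not the commenter's method; it assumes NumPy is installed):

```python
import numpy as np

# Exact values of sum(k^2) for n = 1..6; more points than a cubic needs,
# so the least-squares fit is fully determined.
n = np.arange(1, 7)
y = np.array([sum(k * k for k in range(1, m + 1)) for m in n])

# Fit a cubic. The recovered coefficients approximate 1/3, 1/2, 1/6, 0,
# i.e. sum(k^2) = n^3/3 + n^2/2 + n/6 = n(n+1)(2n+1)/6.
coeffs = np.polyfit(n, y, 3)
print(np.round(coeffs, 4))
```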

This is another one that's

This is another one that's most naturally done as a one-liner in Perl 6:

  say ([+] 1..100) ** 2 - [+] (1..100).map: * ** 2


Using Clojure, it's possible to salvage some dignity for the JVM:

  (defn euler006 [n]
    (let [numbers (range 1 (inc n))
          square-of-sum (Math/pow (reduce + numbers) 2)
          sum-of-squares (reduce + (map #(Math/pow % 2) numbers))]
      (- square-of-sum sum-of-squares)))

  (time (println (euler006 100)))

Elapsed time: 4.4 msecs, or 0.0044s in your format. For comparison, the Perl code runs in 0.007s on my machine.

This is an interesting series of posts, thanks for sharing! I've been working through Project Euler for a while now, it's a great way to learn.

Hey redacted, Thanks for

Hey redacted,

Thanks for posting, and thanks for sharing some Clojure code. I've recently become aware of Scala and am intellectually fascinated by both projects. Now if only the Java folks would incorporate some of these good ideas into the core language; it might inject some life back into the Java language and community. But that's a rant for another time. ;)

Thanks for the compliment too. I'll keep writing 'em if you keep reading 'em.

You're doing the generators

You're doing the generators wrong:

  nums = range(1,101)
  print sum(nums)**2 - sum(n**2 for n in nums)

Timing results:

% python -m timeit "nums = range(1,101); sum(nums)**2 - sum(n**2 for n in nums)"
10000 loops, best of 3: 33.1 usec per loop

That's .0331s in your format.

Your code is even faster than

Your code is even faster than the equivalent in NumPy:

  import numpy as np
  np.sum(np.arange(1, 100 + 1)) ** 2 - np.sum(np.arange(1, 100 + 1) ** 2)

The operation itself takes 151 µs on an Eee PC 901. Your code needs 114 µs.

> That's .0331s in your format.
No, 33.1µs is 0.0000331s or 0.0331ms.

Comparing this generator and the one from the first comment

For reference, the code from the first comment is 278 / 18.5, roughly 15 times, faster on my computer than the original code from the post:

Running the original script:
python -m timeit "sum1 = 0" "sum2 = 0" "for i in ((x,x ** 2) for x in range(1,100+1)):" " sum1 += i[0]" " sum2 += i[-1]" "print(sum1 ** 2 - sum2)"

Output is:
1000 loops, best of 3: 278 usec per loop

Running code from first comment:
python -m timeit "nums = range(1,101); sum(nums)**2 - sum(n**2 for n in nums)"

Output is:
100000 loops, best of 3: 18.5 usec per loop

Hey Matt & Peter, Thanks for

Hey Matt & Peter,

Thanks for commenting.

To be honest, speed wasn't the goal of this code. At the time I was more interested in working with generators, and I thought this would be a good way to incorporate them into something I was doing.

One question I do have, though: are you guys using Python 2 or 3? I've found that this makes a big difference with my code. My posted code is Python 3, which tends to be slower with lists and generators. And if I weren't at work right now, I'd be testing things. ;)

Python version

I should have said in my post, but I used Python 2.6.1 for both timings, on:

Intel Core2 Duo CPU
T7500 @2.20GHz
789MHz, 1.99GB of RAM

Though, honestly, the point of my post wasn't to criticize the code... I wanted to see the *relative* difference between the pieces of code on the *same* machine. I dug a (very little) bit to see how to use 'timeit' on a multi-line Python snippet (one with mandatory indentation), and thought I'd leave both the timings and the multi-line 'timeit' invocation for posterity. I myself am a Python newbie coming from years of Perl, so I don't really have anything substantive to offer, but I can do a little grunt work like this...

Cool stuff, though -- fyi, I'm following your posts through 'PlanetPython' on Google Reader.

Keep posting!

No worries about the

No worries about the comments; no offense was taken (by yours or Peter's). I am just aware of what I consider a big difference in performance between Python 2 and Python 3.

I wasn't aware of timeit until Peter said something either. I will certainly be plugging away on it later this evening.

Good luck with your Python adventure. I'm still figuring out a lot myself, so if you ever want to collaborate on figuring something out, I'm game.

If you made it this far down into the article, hopefully you liked it enough to share it with your friends. Thanks if you do, I appreciate it.
