### PurpleThinker's blog

By PurpleThinker, 3 months ago,

A few days ago, someone much smarter than me shared with me the following problem. He said it involves "nothing else but simple arrays and for-loops, but has some weird observations to make". I couldn't solve it, even though the solution is simple enough that it can probably be understood by higher-rated pupils on CF. I thought it might be interesting to you, so here it is:

You are given an array $a$ of $n$ positive integers (you know nothing about their maximum value). Find the maximum difference between two adjacent elements if the array $a$ were sorted.

Example:
Input: $n = 4, a = [11, 2, 7, 5]$
Output: $4$, because if we were to sort the array $a$, then we would have $[2, 5, 7, 11]$, so the maximum difference between two adjacent elements is $4$.

Of course, you can just sort the array in $O(n \log n)$ and find the difference that way. Value-based sorting (like radix sort) is not an option because you don't know what the maximum value is. But there is actually a way to find it in $O(n)$, without sorting the array.

Hint
Solution

Hope you enjoyed the problem!

• +274

By PurpleThinker, 17 months ago,

At my university, we are taking the algorithms course this semester. The course is easy for me, since I already know most of the syllabus from my high-school CP background. However, yesterday I learnt something quite surprising in this course. We discussed time complexities: big $O$, $\Omega$, and $\Theta$. Take a look at this (mostly) formal definition of $O$:

Spoiler

Notice that any function $g$ works, as long as it is $\geq f$ (up to a constant factor) for large enough $n$. So big $O$ is just an upper bound for the number of operations of an algorithm, not necessarily the tightest upper bound for its complexity. In other words, when you say "the number of operations of this algorithm is $O(n^2)$", it actually means only that the algorithm does not do significantly more operations than $n^2$. That means that not only is an algorithm with $n^2$ steps $O(n^2)$: algorithms with $n \log n$ or even $n$ operations are $O(n^2)$ too!

The notation that actually corresponds to the worst-case running time we are talking about in CP seems to be the $\Theta$ (big theta) notation.

Formal definition of big theta

For example, binary search does in the worst case $\Theta(\log n)$ steps.
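For reference, these are the usual textbook definitions (presumably equivalent to what the spoilers above contain):

$f(n) = O(g(n))$ iff there exist constants $c > 0$ and $n_0$ such that $f(n) \leq c \cdot g(n)$ for all $n \geq n_0$.

$f(n) = \Theta(g(n))$ iff there exist constants $c_1, c_2 > 0$ and $n_0$ such that $c_1 \cdot g(n) \leq f(n) \leq c_2 \cdot g(n)$ for all $n \geq n_0$.

So $\Theta$ pins down the growth rate from both sides, while $O$ only bounds it from above.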

So my question is: why do we use big-$O$ notation for time complexity in CP, when it's not fully accurate? Or have I misunderstood something about these notations? I don't want to be pedantic, but I'm genuinely wondering why we do this.

• +84

By PurpleThinker, 2 years ago,

An interesting milestone was reached: 100000 blog entries on Codeforces! Codeforces has come a long way, and 100000 blogs is yet another reminder of the international CP community that Codeforces helped create. It's a good opportunity to thank MikeMirzayanov (and the other CF admins) for this amazing platform, which is not just a contest & problem archive, but also a place for people from all around the world to gather and discuss CP. I hope that Codeforces lasts for years and decades to come (and thus also gets to the millionth blog post :) ).

• +295

By PurpleThinker, 3 years ago,

Hello, Codeforces! For many of us, the A + B problem was the first one we ever solved, and we think of it as just a really easy problem. But what if I told you there is a way to flex your programming skills by solving this problem with style? On this wonderful day of March 32nd, I will present a few ways to solve this interesting problem. I will only consider a and b strictly larger than 0.

# Noob approach

std::cout << a + b;


# Almost-noob approach

Notice that we can interpret a + b as incrementing a number a times, then b more times. We get:

int ans = 0;
for (int i = 1; i <= a; i++) ans++;
for (int i = 1; i <= b; i++) ans++;
std::cout << ans;


# Slightly clever approach

Of course, we cannot stop there. A wise man once said: "every competitive programming problem is a dynamic programming problem if you try hard enough". So... let's be a little bit more imaginative! We define dp[i][j] as the result of starting from 0 and incrementing it i times, then j times. So the final result will be dp[a][b] and the recurrence relation is:

dp[i][j] = 1 + max(dp[i-1][j], dp[i][j-1]), for all i <= a and j <= b, with dp[0][0] = 0 and out-of-range terms skipped.

(the max function is, of course, for aesthetic purposes)

# Good programmer approach

Some competitive programming problems require you to transform the task given in the statement into something more manageable and perhaps easier to visualise... what better way to visualise a problem than use graphs! We construct a graph and relabel its nodes like this:

We can see that a + b is in fact equal to the distance between the node labeled a in the upper part and the node labeled b in the lower part. You can now use your favourite graph-traversal algorithm to solve this problem.

# What separates good programmers from LEGENDARY programmers

A programming challenge isn't just about algorithms; it can also be about the programming language you use! Of course, if you want to do it the boring way, sure, use C++, Java or Python... However, if you want something more interesting, I will present to you the best programming language in the world: Brainfuck!

Yep, that really exists. Brainfuck is a very minimalist and easy to learn language. In fact, it only has 8 commands: ><+-.,[]. Who needs user-defined functions and comparators and classes that always create bugs, when you've got Brainfuck! For example, this is a "Hello World!" program in this wonderful language:

++++++++[>++++[>++>+++>+++>+<<<<-]>+>+>->>+[<]<-]>>.>---.+++++++..+++.>>.<-.<.+++.------.--------.>>+.>++.


That's really it: no clunky and buggy functions and IO stuff. Considering the modern trend of simplifying everything (for example, logos and UIs), sooner or later the world will throw away C++, Python and Java and use a minimalist language like Brainfuck. Legendary programmers already use this language of the future and solve all competitive programming problems in Brainfuck.

As I just said, Brainfuck is a very easy-to-learn language. In fact, the implementation of the graph approach to the A + B problem is so elementary that it is left as an exercise for the reader.

• +316