
You are currently reading a thread in /g/ - Technology

File: bigo.jpg (176 KB, 1628x1100)
What the fuck do you use this Big-Oh bullshit for?
>>
Comparing algorithm runtime and memory usage.
>>
It's related to the complexity of computable problems as the problem size grows.
I'm not sure, though.
>>
It's a concept used to evaluate runtimes based on input size. It's not something you physically type up or can actually see, it's more of an abstract idea. A real world equivalent would be like estimating how long it should take a sprinter to run some arbitrary distance, given that you know how fast they run for smaller amounts of distance. Building on this, imagine having a stable of sprinters, and long distance runners. Then you can estimate who would be best for which race (which algorithm is best for which problem) based on the distance (input size).
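
To make that concrete, a rough sketch (the class name, the array-summing workload, and the sizes are all made up for illustration): time the same method at growing input sizes and watch how the elapsed time scales, just like timing the sprinter over shorter distances first.

// Rough sketch: time a linear workload at growing input sizes.
// Doubling n should roughly double the elapsed time; that observed
// scaling is what big-O describes abstractly.
public class GrowthDemo {
    static long sum(int[] a) {
        long s = 0;
        for (int x : a) s += x;
        return s;
    }

    public static void main(String[] args) {
        for (int n = 1_000_000; n <= 8_000_000; n *= 2) {
            int[] a = new int[n]; // contents don't matter for the timing
            long t0 = System.nanoTime();
            sum(a);
            long t1 = System.nanoTime();
            System.out.printf("n = %d: %.2f ms%n", n, (t1 - t0) / 1e6);
        }
    }
}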
>>
annndd this thread is proof that /g/ is just children who don't actually work as devs
>>
>>51971021
Or maybe the guys at your Fortune 500 companies are still just guys who were college students a year ago, and not everyone is fucking Donald Knuth?
>>
>>51971051
People actually do that? He must get a lot of penis for his CS knowledge.
>>
I use it to hassle my sloppy coworkers who write 17 nested loops and wonder about abysmal response times.
>>
>>51971069
He does. He was the first one to start the whole "dressing like a trap makes you good at programming" thing. Apparently he developed the idea after reading up on Alan Turing and the chemical castration stuff.
>>
>>51970908
filtering out retards like you from the course
>>
>>51971051

Not really; all developers should know big-O considering how straightforward it is. Anyone who understands how functions work and how a loop works can understand big-O notation pretty quickly.

Take the function
f(n) = n

Take the loop

for (int i = 0; i < 10; i++) {
    System.out.println("OP sucks");
}

It is a single loop, so each iteration takes n time.

Therefore, big oh notation is O(n)

Now take the loop

for (int i = 0; i < 10; i++) {
    for (int j = 0; j < 10; j++) {
        System.out.println("OP sucks");
    }
}

Now each iteration takes n^2 time since the first iteration must run and the second must run. Therefore, the big o notation is O(n^2).

The example I just gave is, 99% of the time, the kind of big-O question entry-level interviews will ask you.
>>
>>51971137
please be trolling
>>
>>51970908
It's a convenient way to approximate run-time for algorithms (it could be with respect to memory or anything really; time complexity, sloppily "run-time", is the usual concern). If my algorithm ∈ O(n^2) and works with four objects (or whatever the algorithm operates on), then doubling the amount of objects from four to eight would approximately quadruple the run-time. Big-O is very simple. It's not bullshit. It's a great way to quickly learn about the strengths and weaknesses of data structures and algorithms (http://bigocheatsheet.com for the curious!)
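
For instance, a minimal sketch that counts basic operations instead of wall-clock time (the class and method names are invented; the counter is just a stand-in for work done):

// Count the inner-loop executions of a quadratic algorithm at n = 4 and n = 8.
// For anything in O(n^2), doubling n roughly quadruples the work:
// 8^2 / 4^2 = 64 / 16 = 4.
public class QuadraticDemo {
    static long countOps(int n) {
        long ops = 0;
        for (int i = 0; i < n; i++)
            for (int j = 0; j < n; j++)
                ops++; // one unit of work per inner iteration
        return ops;
    }

    public static void main(String[] args) {
        System.out.println(countOps(4)); // 16
        System.out.println(countOps(8)); // 64, i.e. 4x the work
    }
}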
>>
>>51971166

I'm not sure what you expect me to say? I've interviewed at quite a few places (with offers) and in the first rounds the complexity questions are never more difficult than my previous comment.
>>
>>51971137
That's still O(n).
>>
>>51970908
Its a quantifiable comparison of performance in various respects between algorithms.
>>
It's a stupid notation for a pretty intuitive concept. Of course, it's not applicable in the slightest in the real world, where any damn system dev will just call collections.sort() and call it a day.
>>
>>51971070
>17
What job could possibly require this many loops?
>>
>>51971137
Uhhhhhh, aren't those both constant? If it were:

for (int i = 0; i < n; i++) {
    System.out.println("OP sucks");
}

Then it would be O(n), since the input size is unknown. But you know the input size, 10, so it would be O(10), which reduces to O(1).

>It is a single loop, so each iteration takes n time.

Yeah ... no. Each iteration takes constant time. So, 10 iterations * O(1) = O(10) = O(1)

I'm pretty sure you have no idea what you're talking about.
>>
>>51971334
You're even more retarded.

This truly is neo/v/
>>
>>51971238
Not the guy who made the claim it's O(n^2), but how would that not be O(n^2) with respect to printing, when it's a nested for-loop with bound variables i and j? The number of printouts equals i * j, which in the worst case is approximated as n * n = n^2. You could cut down on the approximation and say that the problem belongs to O(i * j), but that's still in O(n^2).
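
To spell it out, a quick sketch with the loop bound made a parameter (names are made up; the counter stands in for the println):

// With the bound as a parameter n, the printout count is exactly n * n,
// which is why the n-bounded version is O(n^2). With the bound hardcoded
// to 10, it's always 100 printouts, i.e. constant.
public class NestedLoopCount {
    static int countPrints(int n) {
        int count = 0;
        for (int i = 0; i < n; i++)
            for (int j = 0; j < n; j++)
                count++; // stands in for System.out.println("OP sucks")
        return count;
    }

    public static void main(String[] args) {
        System.out.println(countPrints(10)); // 100
        System.out.println(countPrints(20)); // 400: doubling n quadruples it
    }
}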
>>
>>51971366
Maybe. I thought that was how I learned it in my algo class.
>>
what a terrible accumulation of retards spouting sciolism this thread is
>>
>>51971424
>>51971366
Start this on /sci/ and I guarantee you it won't go much better
>>
This is why Stack Overflow is so much better than this place. Just stick to the usual shitposting, anons.
>>
>>51971366
not him, but it sure seems constant to me

explain yourself
>>
>>51970993
This is a terrible analogy because it compares linear problems

>>51971137
>considering how straightforward it is
Apparently not straightforward enough for you to understand it

>It is a single loop, so each iteration takes n time.
The time it takes is irrelevant

>Therefore, big oh notation is O(n)
This isn't O(n)

>Now each iteration takes n^2 time since the first iteration must run and the second must run
This isn't O(n^2)

>>51971209
It's a rough estimation that leads uninformed students to believe that linked lists are of any use
>>
I feel like every resource I look at uses big-O to mean something else. Most of the stuff I find online seems to use it to mean a tight bound (Θ), but the CLRS book I'm using says it represents the upper bound. So, if it represents the upper bound, wouldn't n = O(n^2)? Or am I retarded, idk.
>>
>>51971137
U wot m8.

>>51970908
Given an algorithm which runs on a problem of size n, its computational complexity (expressed in big-O notation) tells you the rough order of the number of steps taken to run it. A function in O(n) will take at most on the order of n steps to run. More rigorously, things like O(n) are sets of functions, and functions with that time complexity or lower are members of that set. If it's in O(log n), it's in O(n), but you usually don't use that. If you want to be more precise, there are some more symbols, like Theta(n) (exactly on the order of n) and Omega(n) (at least on the order of n, maybe worse). o(n) is strictly better than O(n), little omega(n) strictly worse.

It's all big-picture thinking: the constants aren't taken into account. For example, you don't care that this computer is 40% faster than the other, because you can't easily classify that stuff and it depends on a lot of things that aren't inherent to the algorithm itself. Given two functions, one in O(n) and the other in O(n^2), the second might still be faster for certain n, even though the order is worse.

The scaling behavior is more interesting to look at. If for a certain n two functions in O(n) and O(n^2) run equally quickly, the first function will eventually be quicker as you increase the input size. Big-O notation is nice for comparing algorithms.

Also, don't forget, a program is a sequence of smaller programs. If you start using your program for larger input and find out that it runs slowly, the parts of the program with the higher time complexity are the first place to start looking for optimizations.

Note that you can use Big O for more than time complexity, you can also express space required.

>>51971383
The variables i and j aren't bounded by n; they always go to 10. for(i=0;i<n;i++){for(j=0;j<n;j++){...}} clearly is in O(n^2). In general, don't say O(x*y) is O(n^2) unless you know how x and y depend on n.
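
To illustrate the "might still be faster for certain n" point above, a toy comparison (the step counts 100*n and n*n are invented for illustration):

// A hypothetical O(n) algorithm doing 100*n steps vs. an O(n^2) algorithm
// doing n*n steps. The quadratic one wins below the crossover at n = 100,
// even though its asymptotic order is worse.
public class CrossoverDemo {
    public static void main(String[] args) {
        for (int n : new int[]{10, 50, 100, 200, 1000}) {
            long linear = 100L * n;
            long quadratic = (long) n * n;
            System.out.printf("n = %4d: 100n = %7d, n^2 = %7d -> %s wins%n",
                    n, linear, quadratic,
                    quadratic < linear ? "O(n^2)" : "O(n)");
        }
    }
}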
>>
>>51971555
Technically, n ∈ O(n^2), but yeah.
>>
>>51971555
No, big-O is used pretty consistently.

> So, if it represents the upper bound, wouldn't n = O(n^2)?
Element of, but yes. And n^2 is also an element of O(n^n^n).

This also irked Knuth and others: https://en.wikipedia.org/wiki/Big_O_notation#The_Knuth_definition
>>
>>51971522
>It's a rough estimation that leads uninformed students to believe that linked lists are of any use
Yes, I agree.

>The variables i and j aren't bounded by n; they always go to 10. for(i=0;i<n;i++){for(j=0;j<n;j++){...}} clearly is in O(n^2). In general, don't say O(x*y) is O(n^2) unless you know how x and y depend on n.
I didn't say the variables are bounded by n; I said they're bounded in the loop. They always go to 10. This doesn't matter: whether you hardcode the bounds (i.e. they always go to 10) or pass them as arguments doesn't change the complexity of the function with respect to a variable. In this case x and y don't depend on n at all; there was no n defined anywhere. I implicitly approximated the smaller of the variables as the larger one, and then renamed both to n, which is what you usually do for worst-case complexity.
>>
>>51971555
The way I've learned it, O is the upper bound, Ω is the lower bound, and Θ is the tight bound. o is asymptotically strictly less than, whereas ω is asymptotically strictly greater.

Some examples:
n is an element of both O(n) and O(n^2), but not O(log n).
n is an element of Ω(n) and Ω(log n), but not of Ω(n^2).
n is an element of Θ(n), but not of Θ(log n) or Θ(n^2).
n is an element of o(n^2), but not of o(n).
n is an element of ω(log n), but not of ω(n).

These things are all sets, and a function can be a member of that set. Of course, you can do some set stuff on that and say stuff like:
If a function is in O(log n), it is also in O(n), because O(log n) ⊆ O(n).

In practice I never use any of this, but my qt algorithms lecturer was extremely pedantic about this shit.
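
For reference, the formal definitions behind those sets (the standard textbook versions, e.g. as in CLRS), in LaTeX:

% f, g : N -> N; c ranges over positive constants, n_0 over thresholds.
\begin{align*}
f \in O(g)      &\iff \exists c, n_0\ \forall n \ge n_0 : f(n) \le c\,g(n) \\
f \in \Omega(g) &\iff \exists c, n_0\ \forall n \ge n_0 : f(n) \ge c\,g(n) \\
f \in \Theta(g) &\iff f \in O(g) \text{ and } f \in \Omega(g) \\
f \in o(g)      &\iff \forall c\ \exists n_0\ \forall n \ge n_0 : f(n) < c\,g(n) \\
f \in \omega(g) &\iff \forall c\ \exists n_0\ \forall n \ge n_0 : f(n) > c\,g(n)
\end{align*}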
>>
>>51971116

underrated post
>>
>>51971817
Wait fuck, after learning this a year ago and typing this out now, I only just now noticed ω (omega) looks like a w for worse.

That's so fucking adorable \(°ω°)/
>>
>>51971706
>>51971659
Ok, so can we also say that for f(n) = n and g(n) = n,
n ∈ O(g(n)), i.e. n ∈ O(n), because f(n) <= cg(n) for the constant c = 2 and all n >= n0 = 0 (i.e. n <= 2n)
or is that retarded
>>
>>51971817
I learned it ~ the same, but we did not get qt algorithms lecturers.

Also, the retarded part is that while this is all theoretically useful to know, most works you could grab algorithms from don't list most of these metrics. Neither do your programming language's standard library docs. Nor does anyone else's.

They mostly just have big-O metrics for some stuff... if you're lucky.

Ah well, most standard libs also don't have 1/4 of the things discussed in CLRS.
>>
>>51971899
check'd
>>
>>51971911
I'm not sure about calling them equal or about that "because" part.

But for g(n) = n, you can say n ∈ O(g(n)) and n ∈ O(n), sure.
>>
>>51971522
It's not that bad of an analogy when you let the fat kids be the O(n^2) algorithms.
Sometimes your high school only has fat kids, and somebody has to run the race if you're gonna field a track team.
>>
>>51971366
He's correct you numbskull
>>
>What the fuck do you use this Big-Oh bullshit for?
*accidentally implements O(n^2) algorithm*
>>
>>51971992
PS: Customarily it is written as O(n); you usually make the expression inside the O as simple as possible.

If you analyzed something to be exactly Big-Theta(1 + 2n + 4n^2), it's still customarily written as just Big-O(n^2).
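
Spelled out with explicit constants (a routine verification, in LaTeX):

% Why 1 + 2n + 4n^2 is Theta(n^2), hence customarily written O(n^2):
\begin{align*}
1 + 2n + 4n^2 &\le 7n^2 \quad (n \ge 1) &&\Rightarrow\ 1 + 2n + 4n^2 \in O(n^2), \\
1 + 2n + 4n^2 &\ge 4n^2 \quad (n \ge 0) &&\Rightarrow\ 1 + 2n + 4n^2 \in \Omega(n^2).
\end{align*}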
>>
Never heard of it, since I'm not a CS plebeian, but I looked it up.
This is basically math-tier right after you're done with addition. Damn, /g/, get your shit together.
>>
>>51970908
You use it to get a degree in CS, then never again after
>>
>>51972146
No, Big-O is actually used at least somewhat frequently to decide which algorithm gets work done better when you program.
>>
>>51972172
I've been a professional software engineer for 4 years and haven't used Big O notation in my entire career. The concepts are useful, but the notation is unnecessary.
>>
Formally f ∈ O(g) is just a relation on functions N -> N.
>>
This thread is proof that all the C/Lisp tards on /g/ don't know jack shit about anything.

>>51971659
*O(n) ⊂ O(n^2)

>>51972172
We all know the best sorting and search algorithms already.
>>
I had to use it once to explain the superiority of quadtrees, octrees, etc. over plain lists, as opposed to dumping all objects in a dynamic array, in reference to frustum culling and similar search algorithms.
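
Roughly the idea, as a toy sketch (class names, the leaf capacity of 4, and storing bare points are all simplifications for illustration; a real engine stores bounding volumes): a quadtree query prunes whole subtrees whose region misses the query rectangle, while a flat array has to test every object.

// Toy point quadtree: insert points, then query a rectangle ("view frustum").
public class QuadtreeSketch {
    // Axis-aligned box: center (cx, cy), half-width hw, half-height hh.
    static class Box {
        double cx, cy, hw, hh;
        Box(double cx, double cy, double hw, double hh) {
            this.cx = cx; this.cy = cy; this.hw = hw; this.hh = hh;
        }
        boolean contains(double x, double y) {
            return Math.abs(x - cx) <= hw && Math.abs(y - cy) <= hh;
        }
        boolean intersects(Box o) {
            return Math.abs(cx - o.cx) <= hw + o.hw
                && Math.abs(cy - o.cy) <= hh + o.hh;
        }
    }

    static class Quadtree {
        static final int CAP = 4;              // max points before splitting
        final Box bounds;
        final double[][] pts = new double[CAP][2];
        int n = 0;
        Quadtree[] kids;                       // null until split

        Quadtree(Box b) { bounds = b; }

        void insert(double x, double y) {
            if (!bounds.contains(x, y)) return;        // outside this node
            if (kids == null && n < CAP) { pts[n][0] = x; pts[n][1] = y; n++; return; }
            if (kids == null) split();
            for (Quadtree k : kids)
                if (k.bounds.contains(x, y)) { k.insert(x, y); return; }
        }

        void split() {
            double hw = bounds.hw / 2, hh = bounds.hh / 2;
            kids = new Quadtree[] {
                new Quadtree(new Box(bounds.cx - hw, bounds.cy - hh, hw, hh)),
                new Quadtree(new Box(bounds.cx + hw, bounds.cy - hh, hw, hh)),
                new Quadtree(new Box(bounds.cx - hw, bounds.cy + hh, hw, hh)),
                new Quadtree(new Box(bounds.cx + hw, bounds.cy + hh, hw, hh)),
            };
            // points already stored here stay; only new inserts go to children
        }

        // Collect points inside q, skipping subtrees whose region misses it.
        void query(Box q, java.util.List<double[]> out) {
            if (!bounds.intersects(q)) return;         // prune whole subtree
            for (int i = 0; i < n; i++)
                if (q.contains(pts[i][0], pts[i][1])) out.add(pts[i]);
            if (kids != null) for (Quadtree k : kids) k.query(q, out);
        }
    }

    public static void main(String[] args) {
        Quadtree qt = new Quadtree(new Box(0, 0, 100, 100));
        java.util.Random rng = new java.util.Random(42);
        for (int i = 0; i < 1000; i++)
            qt.insert(rng.nextDouble() * 200 - 100, rng.nextDouble() * 200 - 100);
        java.util.List<double[]> hits = new java.util.ArrayList<>();
        qt.query(new Box(10, 10, 5, 5), hits);         // small query rectangle
        System.out.println(hits.size() + " objects in view, without scanning all 1000");
    }
}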
>>
>>51972237
That's because these things really only apply to two groups of people:
1. Those who write time/resource-sensitive code
2. Academics
>>
>>51972237
Such is the life of the Java codemonkey.