

Intro to DSA & Big O Notation

Points to keep in mind to master DSA:

Mastering DSA makes you eligible for the high salaries offered to software engineers.
DSA is a major part of software engineering.
Before writing code, make sure you understand the big picture, then dive into the details.
It is all about understanding the concepts intuitively and then translating them into code in any language, because DSA is language-agnostic.
Every upcoming concept is connected in some way to the previous ones, so do not skip topics or move on until you have thoroughly mastered the current concept through practice.
When we learn concepts intuitively, we gain a deeper understanding of the material, which helps us retain the knowledge for longer.
If you follow these suggestions, you have nothing to lose.

Linear DS:
Arrays
LinkedList(LL) & Doubly LL (DLL)
Stack
Queue & Circular Queue

Non-linear DS:
Trees
Graphs

## Big O Notation

Understanding this notation is essential for comparing the performance of algorithms.
It is a mathematical way of comparing how efficient different algorithms are.

## Time Complexity

The faster the code runs, the lower its time complexity.
Very important for most interviews.

## Space Complexity

Rarely considered compared to time complexity, because storage is cheap.
Still needs to be understood, because interviewers may ask you about it as well.
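
As a rough illustration (the functions below are hypothetical, not from the original post), `sumIterative` uses O(1) extra space while `buildArray` uses O(n) extra space:

```js
// O(1) extra space: only a single accumulator, no matter how large n is
function sumIterative(n){
  let total = 0;
  for(let i = 1; i <= n; i++) total += i;
  return total;
}

// O(n) extra space: the array grows with n
function buildArray(n){
  const items = [];
  for(let i = 1; i <= n; i++) items.push(i);
  return items;
}
```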

Three Greek letters:

  1. Omega
  2. Theta
  3. Omicron, i.e. Big-O [most common]

Cases of an algorithm:

  1. Best case [represented by Omega]
  2. Average case [represented by Theta]
  3. Worst case [represented by Omicron]

Technically speaking, there is no "best-case Big-O" or "average-case Big-O"; those are represented by Omega and Theta respectively.
With Big-O we are always measuring the worst case.
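
As a quick sketch of how the cases differ (a hypothetical linear search, not from the original post):

```js
// Linear search: best case is Omega(1) (target found at index 0),
// worst case is O(n) (target at the end or not present).
// In Big-O terms we describe it by its worst case: O(n).
function linearSearch(arr, target){
  for(let i = 0; i < arr.length; i++){
    if(arr[i] === target) return i;  // best case: found immediately
  }
  return -1;                         // worst case: scanned all n items
}

console.log(linearSearch([7, 3, 9, 1], 7));  // 0 -> best case, 1 comparison
console.log(linearSearch([7, 3, 9, 1], 1));  // 3 -> worst case, n comparisons
```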

## O(n): Efficient Code
Proportional.
It's simplified by dropping the constant values.
An operation happens 'n' times, where n is passed as an argument, as shown below.
The graph is always a straight line with slope 1, since the number of operations is proportional to n.
X axis: value of n.
Y axis: number of operations.

```js
// O(n)
function printItems(n){
  for(let i = 1; i <= n; i++) console.log(i);
}
```
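
The "dropping the constant values" simplification mentioned above can be sketched like this (a hypothetical `printItemsTwice`, not from the original post):

```js
// Two back-to-back loops perform 2n operations,
// but O(2n) simplifies to O(n) because the constant 2 is dropped.
function printItemsTwice(n){
  for(let i = 0; i < n; i++) console.log(i);  // n operations
  for(let j = 0; j < n; j++) console.log(j);  // n more operations
}
printItemsTwice(3);  // logs 6 items, still O(n)
```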





## O(n^2):
Nested loops.
The number of items output in this case is n*n for an input of n.

```js
// O(n*n)
function printItems(n){
  for(let i = 0; i < n; i++)
    for(let j = 0; j < n; j++) console.log(i, j);
}
```





## O(n^3):
The number of items output in this case is n*n*n for an input of n.

```js
// O(n*n*n)
function printItems(n){
  for(let i = 0; i < n; i++)
    for(let j = 0; j < n; j++)
      for(let k = 0; k < n; k++) console.log(i, j, k);
}
```


## Drop non-dominants:
```js
// O(n*n) + O(n)
function printItems(n){
  // O(n*n): nested for loop
  for(let i = 0; i < n; i++)
    for(let j = 0; j < n; j++) console.log(i, j);

  // O(n): single for loop
  for(let k = 0; k < n; k++) console.log(k);
}
```

The complexity of the above code is O(n*n) + O(n).
By dropping non-dominants, it becomes O(n*n).
O(n) becomes negligible as the value of n grows: O(n*n) is the dominant term and O(n) is the non-dominant term here.

## O(1):
Referred to as constant time, i.e. the number of operations does not change as 'n' changes.
A constant number of operations irrespective of the number of operands.
MOST EFFICIENT. Nothing is more efficient than this.
On the graph it is a flat line running just above the x-axis.


```js
// O(1)
function printItems(n){
  return n * n * n * n;  // a fixed number of operations, regardless of n
}
printItems(3);
```


## Comparison of Time Complexity:
O(1) > O(n) > O(n*n)  (ordered from most efficient to least efficient)
## O(log n)
Divide and conquer technique.
Partitioning into halves until goal is achieved.

log(base 2) of 8 = 3, i.e. we are basically asking: 2 raised to what power gives 8? That power is the number of operations needed to get to the result.

To put it another way: how many times do we need to divide 8 into halves (which is what makes the base 2) until we are left with a single target item? The answer is 3.

Ex. An amazing application: for an array of size 1,000,000,000, how many times do we need to cut it in half to get to the target item?
log(base 2) of 1,000,000,000 ≈ 30
i.e. about 30 halvings (2^30 ≈ 1,000,000,000) will get us to the target item.

Hence, if we search in a linear fashion we may have to scan through a billion items in the array.
But if we use the divide & conquer approach, we can find it in only about 30 steps.
This is the immense power of O(log n)
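
To make this concrete, here is a small binary search sketch (illustrative code, not from the original post; the million-element array is just an example):

```js
// Binary search: each comparison halves the remaining range,
// so at most about log2(n) steps are needed. Assumes a sorted array.
function binarySearch(sortedArr, target){
  let low = 0, high = sortedArr.length - 1;
  let steps = 0;
  while(low <= high){
    steps++;
    const mid = Math.floor((low + high) / 2);
    if(sortedArr[mid] === target) return { index: mid, steps };
    if(sortedArr[mid] < target) low = mid + 1;
    else high = mid - 1;
  }
  return { index: -1, steps };
}

const sorted = Array.from({ length: 1000000 }, (_, i) => i);
console.log(binarySearch(sorted, 999999)); // found in ~20 steps instead of 1,000,000
```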

## Comparison of Time Complexity:
O(1) > O(log n) > O(n) > O(n*n)
Best is O(1) or O(log n)
Acceptable is O(n)
O(n log n):
Used in some sorting algorithms.
It is the most efficient time complexity we can achieve for a general (comparison-based) sort, unless we are sorting only numbers.
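
As an illustration of an O(n log n) sort, here is a minimal merge sort sketch (not code from the original post): the array is halved log n times, and each level does O(n) work to merge.

```js
function mergeSort(arr){
  if(arr.length <= 1) return arr;              // base case: already sorted
  const mid = Math.floor(arr.length / 2);
  const left = mergeSort(arr.slice(0, mid));   // sort left half
  const right = mergeSort(arr.slice(mid));     // sort right half

  // Merge the two sorted halves: O(n) work per level of recursion
  const merged = [];
  let i = 0, j = 0;
  while(i < left.length && j < right.length){
    merged.push(left[i] <= right[j] ? left[i++] : right[j++]);
  }
  return merged.concat(left.slice(i), right.slice(j));
}

console.log(mergeSort([5, 3, 8, 1, 9, 2])); // [1, 2, 3, 5, 8, 9]
```
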
Tricky interview question: different terms for inputs.
```js
function printItems(a, b){
  // O(a)
  for(let i = 0; i < a; i++) console.log(i);

  // O(b)
  for(let j = 0; j < b; j++) console.log(j);
}
// Total: O(a + b) -- different inputs a and b stay as different terms
// and cannot be collapsed into O(n).
```


## Arrays
No reindexing is required in an array for push-pop operations, hence both are O(1).
Adding/removing at the end of an array is O(1).

Reindexing is required in an array for shift-unshift operations, hence both are O(n) operations, where n is the number of items in the array.
Adding/removing at the front of an array is O(n).

Inserting anywhere in an array except the start and end positions:
myArr.splice(indexForOperation, itemsToBeRemoved, contentToBeInserted)
The remaining items after the insertion point have to be reindexed.
Hence it is O(n) and not O(0.5 n), since Big-O always measures the worst case, not the average case; 0.5 is a constant, so it is dropped.
The same applies to removing an item from an array, since the items after it have to be reindexed.


Finding an item in an array:
if it's by value: O(n)
if it's by index: O(1)
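
Putting the array complexities together (a quick sketch using a sample array, not from the original post):

```js
const myArr = [10, 20, 30, 40];

myArr.push(50);          // O(1): add at the end, no reindexing
myArr.pop();             // O(1): remove from the end, no reindexing

myArr.unshift(5);        // O(n): add at the front, every item is reindexed
myArr.shift();           // O(n): remove from the front, every item is reindexed

myArr.splice(2, 0, 25);  // O(n): items after index 2 are reindexed

myArr.indexOf(25);       // O(n): search by value scans the array
myArr[2];                // O(1): access by index
```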

Select a DS based on the use case.
For index-based access, an array is a great choice.
If a lot of insertion/deletion is performed at the beginning, use some other DS, as the reindexing will make an array slow.

Comparison of time complexities for n = 100:

O(1) = 1
O(log n) ≈ 7
O(n) = 100
O(n^2) = 10,000

Comparison of time complexities for n = 1000:

O(1) = 1
O(log n) ≈ 10
O(n) = 1000
O(n^2) = 1,000,000

Focus mainly on these 4:
Big O(n*n): nested loops
Big O(n): proportional
Big O(log n): divide and conquer
Big O(1): constant

O(n!) usually only happens when we deliberately write terrible code.
O(n*n) is a horrible complexity for an algorithm.
O(n log n) is acceptable and is used by some sorting algorithms.
O(n): acceptable.
O(log n), O(1): best.

The space complexity of almost all DS is roughly the same, i.e. O(n).
For sorting algorithms, the space complexity varies from O(n) down to O(log n) or O(1).

Time complexity varies from algorithm to algorithm.

The best time complexity for sorting anything other than numbers (e.g. strings) is O(n log n), which is the case for quick sort, merge sort, Timsort, and heap sort.

The best way to apply what you have learned is to write as much code as possible.

Choose which DS to use for a given problem statement based on the pros and cons of each DS.

For more information, refer to: bigocheatsheet.com

Reposted from: https://dev.to/mahf001/intro-to-dsa-big-o-notation-5gm9