# Notes And Questions NCERT Class 12 Computer Science Chapter 14 Idea Of Efficiency

Please refer to the Idea Of Efficiency Class 12 Computer Science notes and important questions below. The chapter-wise Class 12 Computer Science notes have been prepared based on the latest syllabus issued for the current academic year by CBSE. Students should revise these notes and go through the important Class 12 Computer Science examination questions given below to obtain better marks in exams.

## Idea Of Efficiency Class 12 Computer Science Notes and Questions

The below Class 12 Idea Of Efficiency notes have been designed by expert Computer Science teachers. These will help you a lot to understand all the important topics given in your NCERT Class 12 Computer Science textbook. Refer to Chapter 14 Idea Of Efficiency Notes below which have been designed as per the latest syllabus issued by CBSE and will be very useful for upcoming examinations to help clear your concepts and get better marks in examinations.

• Learning Outcomes: Learn about the concept of efficiency in algorithms and computing in general.
• Algorithm efficiency relates to the time taken by a program to execute and the space used by the algorithm.
• Performance of an algorithm depends on many internal and external factors.
• Internal Factors specify algorithm’s efficiency in terms of:
• Time required to run
• Space required to run
• External Factors affecting algorithm efficiency are:
• Size of the input to the algorithm
• Speed of the computer on which it is run
• Quality of the compiler
• Internal factors mainly control the algorithm’s efficiency.
• Computational Complexity is a measure of the efficiency of an algorithm or program in terms of space and time.
• A program that uses the minimum resources, i.e. less space and less time, is considered a good program.
• Time Complexity:
• It is the number of elementary instructions that the program executes.
• This number depends on the size of the input data.
• The time complexity of algorithms is most commonly expressed using the big O notation.

• Time Complexity of an algorithm:

• Best case: The minimum number of steps that an algorithm can take for any input data values.
• Average case: The efficiency averaged over all possible inputs. A uniform distribution of inputs is normally assumed.
• Worst case: The maximum number of steps that an algorithm can take for any input data values.
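A minimal sketch (not from the textbook) of best and worst cases, using a simple linear search on a list:

```python
def linear_search(items, target):
    """Return the index of target in items, or -1 if absent."""
    for i, value in enumerate(items):
        if value == target:
            return i   # found: stop scanning
    return -1

# Best case: target is the first element, so only 1 comparison is needed.
print(linear_search([7, 3, 9, 1], 7))   # → 0
# Worst case: target is absent, so all n elements are compared.
print(linear_search([7, 3, 9, 1], 5))   # → -1
```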

• Three asymptotic notations are most commonly used to represent the time complexity of algorithms.

• Θ (theta notation) – to denote a tight bound (average case)
• O (big-oh notation) – to denote an upper bound (worst case)
• Ω (omega notation) – to denote a lower bound (best case)

• Big O Notation:

• The growth rate determines the algorithm’s performance when its input size grows.
• It is the upper bound of an algorithm’s performance, i.e. if we say an algorithm takes O(n²) time, this means that the algorithm will carry out its task in at most n² steps (up to a constant factor) for input size n.
• Performance of an algorithm is inversely proportional to its wall clock time: programs with a bigger O run slower than programs with a smaller O.

• Dominant term: if the algorithm’s performance is an² + bn + c, then an² is the dominant term, which has the maximum impact on the algorithm’s performance as n grows.
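A quick illustration of why the an² term dominates, using hypothetical coefficients a = 3, b = 100, c = 500 (chosen only for this example):

```python
# Hypothetical coefficients for the cost function an² + bn + c.
a, b, c = 3, 100, 500

for n in (10, 1_000, 100_000):
    total = a * n**2 + b * n + c
    dominant = a * n**2
    # Fraction of the total cost contributed by the an² term:
    # it approaches 1 as n grows, so the other terms can be ignored.
    print(n, round(dominant / total, 4))
```

Even though b and c are much larger than a, the an² term accounts for almost all of the cost once n is large.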
• Calculating complexity (Assumption: all steps take precisely the same time to execute.)
• The running time of a loop is equal to the running time of the statements inside the loop (including tests) multiplied by the number of iterations.
1. Loops

```python
for i in range(n):
    m = m + 2
```

Total time = c × n = cn, where c is the time taken by one iteration and n is the number of iterations,
i.e. complexity is O(n)
2. Nested Loops

```python
for i in range(n):
    for j in range(n):
        L = L + 1
```

Total time = c × n × n = cn²,
i.e. complexity is O(n²)
3. Consecutive Statements

```python
a = a - 1
for i in range(n):
    m = m + 2
for j in range(n):
    for k in range(n):
        b = b + 1
```

Total time = c0 + c1·n + c2·n·n = c0 + c1n + c2n²,
i.e. complexity is O(n²)
4. if-then-else Statements

```python
if len(list1) != len(list2):
    return False
else:
    for i in range(n):
        if list1[i] != list2[i]:
            return False
```

Total time = c0 + c1 + (c2 + c3)·n,
i.e. complexity is O(n)
5. Logarithmic Complexity: the algorithm’s performance has a logarithmic factor, e.g. binary search is O(log n)
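Binary search halves the remaining search space on every step, which is why it takes about log₂ n steps. A minimal sketch on a sorted list:

```python
def binary_search(sorted_items, target):
    """Return the index of target in sorted_items, or -1 if absent.

    Each iteration halves the search space, so it runs in O(log n).
    """
    low, high = 0, len(sorted_items) - 1
    while low <= high:
        mid = (low + high) // 2
        if sorted_items[mid] == target:
            return mid
        elif sorted_items[mid] < target:
            low = mid + 1    # discard the left half
        else:
            high = mid - 1   # discard the right half
    return -1

print(binary_search([1, 3, 5, 7, 9, 11], 7))   # → 3
```

For a list of 1,000,000 elements, binary search needs at most about 20 comparisons, while a linear scan could need 1,000,000.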
• Growth rates order: O(1) < O(log n) < O(n) < O(n log n) < O(n²) < O(n³) < O(2ⁿ)
• Wall Clock Time:
The wall-clock time is not the number of seconds that the process has spent on the CPU; it is the elapsed time, including time spent waiting for its turn on the CPU (while other processes get to run).
Performance of an algorithm is inversely proportional to the wall clock time it records for a given input size.
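One way to measure wall-clock time in Python is `time.perf_counter()`, which reads an elapsed-time clock (the loop body below is just a placeholder workload for the example):

```python
import time

def count_steps(n):
    """A placeholder O(n) workload: sum the integers 0..n-1."""
    total = 0
    for i in range(n):
        total += i
    return total

start = time.perf_counter()
count_steps(1_000_000)
elapsed = time.perf_counter() - start   # wall-clock seconds, including waits
print(f"elapsed: {elapsed:.4f} s")
```

Timing the same function for growing input sizes gives a rough, machine-dependent picture of its growth rate.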

• If a recursive program takes less wall clock time than the equivalent program written without recursion, then the program with recursion is more efficient. This again shows that the performance of an algorithm is inversely proportional to its wall clock time.