Why Does Python Have a GIL?
Python uses reference counting for memory management, and concurrent threads could corrupt those reference counts without synchronization. The GIL ensures thread safety by:

- Protecting shared memory (including object reference counts)
- Preventing race conditions
- Simplifying memory management
Good For:

- I/O-bound tasks (network calls, file operations)
- Simpler memory management
- Faster single-threaded performance

Bad For:

- CPU-bound tasks
- Multi-core parallel computation
The Global Interpreter Lock (GIL) ensures that only one thread executes Python bytecode at a time in CPython. This directly affects how multithreading behaves, especially for CPU-bound tasks.
Effect on CPU-Bound Tasks

For CPU-intensive operations (calculations, loops, data processing):

- Multiple threads cannot run in parallel
- Threads take turns executing
- No true multi-core utilization
Effect on I/O-Bound Tasks

For I/O operations (file reading, network requests, database calls):

- When a thread waits for I/O, it releases the GIL
- Another thread can execute
Multithreading works well for:

- Web scraping
- API calls
- File handling
- Network communication
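The GIL-release behavior during I/O can be sketched with the `threading` module. This is a minimal illustration, where `time.sleep` stands in for a real blocking call and the task function and delay are made up for the demo:

```python
import threading
import time

def fake_io_task(results, i):
    time.sleep(0.2)   # blocking waits release the GIL, like real I/O
    results[i] = i * 2

start = time.perf_counter()
results = [None] * 5
threads = [threading.Thread(target=fake_io_task, args=(results, i))
           for i in range(5)]
for t in threads:
    t.start()
for t in threads:
    t.join()
elapsed = time.perf_counter() - start

print(results)        # [0, 2, 4, 6, 8]
print(elapsed < 1.0)  # True: the five 0.2 s waits overlapped, not summed
```

If the tasks were CPU-bound loops instead of sleeps, the GIL would force them to take turns and no such overlap would appear.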
3. How is multithreading achieved in Python?
Multithreading in Python is achieved using the threading module, which allows multiple threads to run concurrently within the same process.
However, due to the GIL (Global Interpreter Lock) in CPython:

- Threads are best suited for I/O-bound tasks
- Not ideal for CPU-bound parallel computation
4. What is multiprocessing?
Multiprocessing is a technique in Python that allows a program to run multiple processes simultaneously, enabling true parallel execution across multiple CPU cores.
Unlike multithreading, multiprocessing bypasses the GIL (Global Interpreter Lock) because each process has its own separate Python interpreter and memory space.
Why Use Multiprocessing?

- Achieve true parallelism
- Utilize multiple CPU cores
- Improve performance for CPU-bound tasks
Key Characteristics

- Each process has:
  - Its own memory space
  - Its own Python interpreter
- More memory usage than threads
- Slower process creation compared to threads
Communication Between Processes

Python provides:

- Queue
- Pipe
- Manager
- Shared memory objects
When to Use Multiprocessing

- Heavy computations
- Data processing
- Machine learning tasks
- Image processing
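A minimal sketch using `multiprocessing.Pool`; the worker function `square` and the pool size are illustrative. The `if __name__ == "__main__"` guard is required on platforms that spawn rather than fork child processes:

```python
from multiprocessing import Pool

def square(n):
    # Runs in a separate process with its own interpreter and memory
    return n * n

if __name__ == "__main__":
    with Pool(processes=4) as pool:
        print(pool.map(square, range(6)))  # [0, 1, 4, 9, 16, 25]
```

Because each worker has its own GIL, the computation can genuinely use multiple cores, at the cost of pickling arguments and results between processes.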
5. What is monkey patching?

Monkey patching means dynamically modifying a class or module at runtime: replacing or adding attributes and methods without changing the original source code. It is common in testing (for example, stubbing out a network call), but should be used sparingly because it makes behavior harder to trace.
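A minimal sketch of monkey patching, using made-up names (`Greeter`, `excited_greet`):

```python
class Greeter:
    def greet(self):
        return "Hello"

def excited_greet(self):
    return "Hello!!!"

# Monkey patch: rebind the method on the class at runtime
Greeter.greet = excited_greet

print(Greeter().greet())  # Hello!!!
```

Every existing and future instance of `Greeter` now uses the patched method, which is exactly why the technique is both powerful and risky.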
6. What are decorators?
Basic Syntax

```python
@decorator_name
def function_name():
    pass
```

This is equivalent to:

```python
function_name = decorator_name(function_name)
```

```python
def my_decorator(func):
    def wrapper():
        print("Before function call")
        func()
        print("After function call")
    return wrapper

@my_decorator
def greet():
    print("Hello!")

greet()
```

Why Decorators Are Used
- Logging
- Authentication
- Timing functions
- Access control
- Caching
- Input validation
Internally, decorators work using:

- First-class functions
- Higher-order functions
- Closures
- Function reassignment
A decorator takes a function as input, wraps it inside another function, and returns the wrapped function.
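A sketch of a practical decorator that combines these pieces; the names `log_calls` and `add` are illustrative, and `functools.wraps` is used so the wrapped function keeps its identity:

```python
import functools

def log_calls(func):
    @functools.wraps(func)           # preserve func.__name__ and docstring
    def wrapper(*args, **kwargs):
        print(f"Calling {func.__name__}")
        return func(*args, **kwargs)
    return wrapper                   # the original name is reassigned to this

@log_calls
def add(a, b):
    return a + b

print(add(2, 3))     # prints "Calling add", then 5
print(add.__name__)  # add (preserved thanks to functools.wraps)
```

Without `functools.wraps`, `add.__name__` would report `wrapper`, which breaks introspection and debugging.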
8. What are closures?
A closure in Python is a function that:

- Is defined inside another function
- Remembers variables from the outer function
- Keeps those variables even after the outer function has finished execution
Why Closures Work

Closures are possible because:

- Python supports first-class functions
- Functions can be returned from other functions
- Inner functions can access outer function variables
How It Works Internally

When outer() runs:

- message is stored in memory
- inner() captures message
- Python keeps message alive even after outer() ends
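The `outer()`/`inner()`/`message` mechanism described above can be sketched as:

```python
def outer():
    message = "Hello from outer"   # captured by the closure

    def inner():
        return message             # still visible after outer() returns

    return inner

fn = outer()                       # outer() has already finished executing
print(fn())                        # Hello from outer
print(fn.__closure__[0].cell_contents)  # the captured variable itself
```

The `__closure__` attribute shows the cell objects that keep `message` alive after `outer()` has returned.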
9. What are context managers?
A context manager in Python is an object that manages resources automatically using the with statement.
It ensures that:

- Resources are properly acquired
- Resources are properly released
- Cleanup happens even if an exception occurs
Common Uses

- File handling
- Database connections
- Thread locks
- Network connections
- Managing transactions
Benefits

- Automatic cleanup
- Exception safety
- Cleaner code
- Resource management
10. How are context managers implemented?
Context managers in Python are implemented in two main ways:

- Using a class with __enter__() and __exit__() methods
- Using the contextlib module with the @contextmanager decorator
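A minimal sketch of the contextlib approach; the resource name is made up, and the `try/finally` around `yield` is what guarantees cleanup even on exceptions:

```python
from contextlib import contextmanager

@contextmanager
def managed_resource():
    print("Acquiring")       # runs when entering the with block
    try:
        yield "resource"     # the value bound by "as"
    finally:
        print("Releasing")   # runs even if the block raises

with managed_resource() as r:
    print(f"Using {r}")      # Using resource
```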
Implementation Using a Class

A context manager class must define:

- __enter__() → Executes when entering the with block
- __exit__() → Executes when exiting the with block
```python
class MyContext:
    def __enter__(self):
        print("Entering block")
        return self

    def __exit__(self, exc_type, exc_value, traceback):
        print("Exiting block")
        return False  # Do not suppress exceptions

with MyContext():
    print("Inside with block")
```

11. What are metaclasses?

A metaclass is a class whose instances are themselves classes; by default, every Python class is created by the built-in metaclass type.
Why Metaclasses Are Used
Metaclasses allow you to:

- Modify class creation
- Enforce coding rules
- Automatically register classes
- Inject methods or attributes
- Implement frameworks (like the Django ORM)
When Are Metaclasses Used?

- Django Models
- ORMs
- Enforcing coding standards
- Automatic plugin registration
- Singleton pattern
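The automatic-registration use case can be sketched with a small metaclass; all the names here (`RegistryMeta`, `PluginBase`, the plugin classes) are invented for the demo:

```python
class RegistryMeta(type):
    registry = {}

    def __new__(mcls, name, bases, namespace):
        cls = super().__new__(mcls, name, bases, namespace)
        if bases:  # register subclasses only, not the base class itself
            RegistryMeta.registry[name] = cls
        return cls

class PluginBase(metaclass=RegistryMeta):
    pass

class CsvPlugin(PluginBase):
    pass

class JsonPlugin(PluginBase):
    pass

print(list(RegistryMeta.registry))  # ['CsvPlugin', 'JsonPlugin']
```

Every subclass is recorded at class-creation time, with no explicit registration call, which is the pattern frameworks like Django use internally.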
12. How do metaclasses differ from normal classes?
A normal class creates objects.
A metaclass creates classes.
| Normal Class | Metaclass |
|---|---|
| Defines object behavior | Defines class behavior |
| Controls instance creation | Controls class creation |
| Uses __init__() | Uses __new__() (usually) |
| Used in regular OOP | Used in advanced frameworks |
Method Difference

Normal Class Methods:

- __init__() → Runs when an object is created
- __str__(), __repr__(), etc.

Metaclass Methods:

- __new__() → Runs when a class is created
- __init__() → Initializes the class
13. What is scope resolution (LEGB rule)?
The LEGB rule defines the order in which Python searches for variables when they are referenced in a program.
LEGB stands for:

- L → Local
- E → Enclosing
- G → Global
- B → Built-in
| Scope | Location | Lifetime |
|---|---|---|
| Local | Inside function | Until function ends |
| Enclosing | Outer function | Until outer ends |
| Global | Module level | Program lifetime |
| Built-in | Python predefined | Always available |
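The four scopes above can be shown in one small sketch (variable and function names are illustrative):

```python
x = "global"              # G: module-level scope

def outer():
    x = "enclosing"       # E: outer function's scope

    def inner():
        x = "local"       # L: searched first
        return x

    return inner(), x

print(outer())       # ('local', 'enclosing')
print(x)             # global
print(len("abc"))    # 3: len is resolved in the Built-in scope, checked last
```

Remove the `x = "local"` line and `inner()` would return `"enclosing"`; remove that too and it would return `"global"`, following the L → E → G → B order.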
14. What is memory pooling in Python?
Why Memory Pooling is Needed

Frequent memory allocation and deallocation from the OS is:

- Slow
- Expensive
- Fragmentation-prone
How Memory Pooling Works

For small objects (≤ 512 bytes):

- Python requests a large chunk of memory from the OS
- Divides it into smaller blocks
- Stores them in memory pools
- Reuses freed blocks for future objects
15. How does Python optimize memory usage?
Reference Counting

Python tracks how many references point to an object.

- When the reference count becomes zero, memory is freed immediately.
- Provides fast cleanup.
Garbage Collection
Handles circular references that reference counting cannot clean.
Python uses Generational Garbage Collection:

- Generation 0 → New objects
- Generation 1 → Survived once
- Generation 2 → Long-lived objects
Younger objects are cleaned more frequently.
Memory Pooling
For small objects (≤ 512 bytes), Python:

- Allocates large memory chunks
- Divides them into smaller blocks
- Reuses freed blocks
This reduces OS-level allocation overhead.
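A few of these mechanisms can be observed directly from CPython's standard library (the exact numbers are implementation details and may vary by version):

```python
import gc
import sys

# Reference counting: each object tracks how many references point to it
a = []
b = a
print(sys.getrefcount(a) >= 2)  # True: a, b (plus the call's temporary ref)

# Generational GC: three generations, each with a collection threshold
print(len(gc.get_threshold()))  # 3

# Small-object caching: CPython pools small ints (-5 to 256)
x, y = 256, 256
print(x is y)  # True: both names point to one cached object
```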
16. What is __new__() vs __init__()?

__new__() Method

- First method called
- Responsible for creating and returning a new instance
- Is a static method
- Required when subclassing immutable types (int, str, tuple)
__init__() Method

- Called after __new__()
- Initializes instance variables
- Must return None (returning any other value raises TypeError)
| Feature | __new__() | __init__() |
|---|---|---|
| Purpose | Create object | Initialize object |
| Called first? | Yes | No |
| Returns | Must return instance | Returns None |
| Used often? | Rare | Very common |
| Used for immutable types | Yes | No |
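The immutable-type case from the table can be sketched with a made-up `Celsius` subclass of `float`:

```python
class Celsius(float):
    """Subclassing the immutable float type requires __new__."""

    def __new__(cls, degrees):
        # __new__ creates and must return the instance;
        # the float value is fixed here and cannot change later
        return super().__new__(cls, degrees)

    def __init__(self, degrees):
        # __init__ runs second and only attaches extra state
        self.unit = "C"

t = Celsius(36.6)
print(t, t.unit)             # 36.6 C
print(isinstance(t, float))  # True
```

Trying to set the numeric value in `__init__` instead would fail, because by that point the immutable float has already been created.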
17. What is the difference between .py and .pyc files?

- .py → Human-readable Python source code
- .pyc → Compiled Python bytecode (read by the Python Virtual Machine)
What is a .py File?

- Contains Python source code
- Written by developers
- Editable and readable
- Interpreted by Python
What is a .pyc File?

- Contains compiled bytecode
- Generated automatically by Python
- Stored inside the __pycache__ folder
- Speeds up program loading
| Feature | .py | .pyc |
|---|---|---|
| Type | Source code | Compiled bytecode |
| Human readable | Yes | No |
| Editable | Yes | No |
| Created by | Developer | Python automatically |
| Location | Project folder | __pycache__ folder |
18. What is serialization and deserialization?

- Serialization → Converting an object into a format that can be stored or transmitted
- Deserialization → Converting that stored/transmitted data back into an object
Why Serialization is Needed

- Store objects in files
- Send data over a network
- Save data to a database
- Caching
- Inter-process communication
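A minimal sketch with the standard library's two common serializers (the sample data is made up):

```python
import json
import pickle

data = {"name": "Alice", "scores": [90, 85]}

# Serialization: object -> text/bytes
as_json = json.dumps(data)      # text format, language-agnostic
as_bytes = pickle.dumps(data)   # binary format, Python-specific

# Deserialization: text/bytes -> object
print(json.loads(as_json) == data)     # True
print(pickle.loads(as_bytes) == data)  # True
```

Prefer json for data exchanged with other systems; pickle handles more Python types but should never be used on untrusted input.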
19. How do you handle large files efficiently?

To handle large files efficiently, avoid loading the entire file into memory. Instead, use:
- Streaming (line-by-line reading)
- Chunk processing
- Generators
- Buffered I/O
- Memory-efficient libraries
| Technique | Memory Efficient | Use Case |
|---|---|---|
| Line-by-line reading | Yes | Text files |
| Chunk reading | Yes | Large binary/text files |
| Generators | Yes | Streaming data |
| Pandas chunks | Yes | Large CSV files |
| mmap | Advanced | Huge files |
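The line-by-line and chunk techniques from the table can be sketched as follows; the helper names, chunk size, and the tiny demo file are all illustrative:

```python
import os
import tempfile

def count_lines(path):
    with open(path) as f:
        return sum(1 for _ in f)   # the file object streams lazily, line by line

def read_in_chunks(path, chunk_size=64 * 1024):
    with open(path, "rb") as f:
        while chunk := f.read(chunk_size):  # fixed-size blocks, never the whole file
            yield chunk

# Demo on a small temporary file
with tempfile.NamedTemporaryFile("w", suffix=".txt", delete=False) as tmp:
    tmp.write("line 1\nline 2\nline 3\n")

print(count_lines(tmp.name))                          # 3
print(sum(len(c) for c in read_in_chunks(tmp.name)))  # 21 bytes total
os.remove(tmp.name)
```

Both approaches keep memory usage bounded by one line or one chunk, regardless of the file's total size.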
20. How do you log exceptions?
Exceptions are logged using Python's built-in logging module, typically inside a try-except block using:

- logging.error()
- logging.exception()
- logger.error()
Logging Levels
| Level | Purpose |
|---|---|
| DEBUG | Detailed info |
| INFO | General info |
| WARNING | Potential issue |
| ERROR | Serious issue |
| CRITICAL | Very serious issue |
Why Use Logging Instead of print()

- Better debugging
- Log levels
- Stores history
- Supports files, console, and remote systems
- Production-ready
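A minimal sketch of exception logging; `logger.exception()` is the key call because it records the full traceback automatically:

```python
import logging

logging.basicConfig(level=logging.ERROR)
logger = logging.getLogger(__name__)

try:
    result = 1 / 0
except ZeroDivisionError:
    # .exception() logs at ERROR level and appends the traceback
    logger.exception("Division failed")
```

Inside an except block, `logger.exception("msg")` is equivalent to `logger.error("msg", exc_info=True)`.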
21. How do you avoid infinite loops?

What is an Infinite Loop?

An infinite loop occurs when the loop condition never becomes False.
To avoid infinite loops:

- Always define a proper termination condition
- Ensure the loop variable updates correctly
- Use break when necessary
- Add safety checks or timeouts
Debugging Infinite Loops

- Add print statements
- Use logging
- Use a debugger
- Check the condition carefully
Common Causes

- Forgetting to update the loop variable
- Incorrect condition
- Floating-point comparison issues
- Wrong indentation
22. How do you check memory usage of an object?
You can check the memory usage of an object using:

- sys.getsizeof() → Basic size
- __sizeof__() → Internal size
- pympler → Deep memory size
- tracemalloc → Track memory allocations
| Method | Measures Nested Objects | Use Case |
|---|---|---|
| sys.getsizeof() | No | Quick check |
| __sizeof__() | No | Internal size |
| pympler.asizeof() | Yes | Accurate deep size |
| tracemalloc | Tracks allocations | Debugging |
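A quick sketch of the shallow measurements; the exact byte counts vary by platform and Python version, so no specific numbers are assumed:

```python
import sys

nums = [1, 2, 3]

print(sys.getsizeof(nums))   # shallow size of the list object, in bytes
print(nums.__sizeof__())     # internal size, without the GC header

# getsizeof is shallow: the elements themselves are not included
deep = sys.getsizeof(nums) + sum(sys.getsizeof(n) for n in nums)
print(deep > sys.getsizeof(nums))  # True
```

For a truly deep measurement of nested structures, use `pympler.asizeof()` or track allocations over time with `tracemalloc`.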
23. What are weak references?

A weak reference (created with the weakref module) refers to an object without increasing its reference count, so the referenced object can still be garbage collected.
Why Weak References Are Useful

- Avoid circular reference issues
- Implement caching systems
- Manage large object graphs
- Observer patterns
- Avoid memory leaks
Weak Reference vs Strong Reference
| Feature | Strong Reference | Weak Reference |
|---|---|---|
| Increases ref count | Yes | No |
| Prevents GC | Yes | No |
| Keeps object alive | Yes | No |
| Used normally | Yes | Special cases |
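A minimal sketch with the weakref module (the `Node` class is made up; the immediate-collection behavior relies on CPython's reference counting):

```python
import weakref

class Node:
    pass

obj = Node()
ref = weakref.ref(obj)   # does NOT increase obj's reference count

print(ref() is obj)  # True: the target is still alive

del obj              # drop the only strong reference
print(ref())         # None: the object was garbage collected
```

Calling the weak reference returns the target while it lives and `None` afterwards, which is how weak-keyed caches detect dead entries.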
24. What are Python descriptors?
A descriptor is an object that defines how attribute access is handled in Python classes using special methods:

- __get__()
- __set__()
- __delete__()
Why Descriptors Exist

When you write obj.attr, Python internally follows a special lookup mechanism. Descriptors allow you to customize that behavior.
They are the foundation of:

- @property
- @staticmethod
- @classmethod
- Many ORM frameworks
Why Descriptors Are Useful

- Enable reusable validation logic
- Core mechanism behind properties
- Used in the Django ORM
- Provide advanced attribute control
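The reusable-validation use case can be sketched with a small data descriptor; the `Positive` and `Product` names are invented for the demo:

```python
class Positive:
    """Reusable validation via the descriptor protocol."""

    def __set_name__(self, owner, name):
        self.name = name                    # learn the attribute's name

    def __get__(self, instance, owner):
        if instance is None:
            return self
        return instance.__dict__[self.name]

    def __set__(self, instance, value):
        if value <= 0:
            raise ValueError(f"{self.name} must be positive")
        instance.__dict__[self.name] = value

class Product:
    price = Positive()   # attribute access is routed through the descriptor

p = Product()
p.price = 10
print(p.price)  # 10
```

The same `Positive` descriptor can guard any number of attributes on any number of classes, which is the kind of reuse `@property` alone cannot give you.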
25. What is method overloading in Python?

Python does not support traditional method overloading: defining two methods with the same name in one class keeps only the last definition. The same effect is achieved with default arguments, *args/**kwargs, or functools.singledispatch.
Method Overloading vs Method Overriding
| Feature | Overloading | Overriding |
|---|---|---|
| Same class | Yes | No |
| Same method name | Yes | Yes |
| Different parameters | Yes | Same parameters |
| Supported in Python | No (traditional) | Yes |
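The two common substitutes for overloading can be sketched like this (function names are illustrative):

```python
from functools import singledispatch

# Only the last def with a given name survives, so "overloading" is
# emulated with default arguments or type-based dispatch.

def greet(name="World"):
    return f"Hello, {name}"

print(greet())         # Hello, World
print(greet("Alice"))  # Hello, Alice

@singledispatch
def describe(value):
    return "something"

@describe.register
def _(value: int):
    return "an integer"

print(describe(42))    # an integer
print(describe(3.5))   # something
```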
26. What is method overriding?
Why Method Overriding is Used

- Customize inherited behavior
- Implement specific logic
- Achieve runtime polymorphism
- Extend functionality
Important Rules

- Method name must be the same
- Parameter list should match (recommended)
- Must use inheritance
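The rules above in a minimal sketch (class names are illustrative); `super()` shows how an override can extend rather than replace the inherited behavior:

```python
class Animal:
    def speak(self):
        return "Some sound"

class Dog(Animal):
    def speak(self):                         # same name and signature: override
        return super().speak() + " -> Woof"  # extend the inherited behavior

print(Animal().speak())  # Some sound
print(Dog().speak())     # Some sound -> Woof
```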
27. How does Python support duck typing?
Python supports duck typing through its dynamic typing system, meaning Python cares about what an object can do (its methods/behavior), not what type it is.
What is Duck Typing?

In duck typing:

- No need for inheritance
- No need for explicit interfaces
- Only the required behavior matters
Why Python Supports Duck Typing

Because Python:

- Is dynamically typed
- Checks types at runtime
- Resolves methods dynamically
- Does not require strict interfaces
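A sketch of duck typing with two unrelated classes (the names are made up): neither inherits from the other, yet both satisfy the caller:

```python
class Duck:
    def speak(self):
        return "Quack"

class Robot:
    def speak(self):
        return "Beep"

def make_it_speak(thing):
    # No isinstance check and no shared base class:
    # anything with a .speak() method is acceptable
    return thing.speak()

print(make_it_speak(Duck()))   # Quack
print(make_it_speak(Robot()))  # Beep
```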
28. How do you optimize Python code?
To optimize Python code:

- Measure performance (profiling)
- Improve algorithms (reduce complexity)
- Use efficient data structures
- Minimize unnecessary computations
- Use built-in functions and libraries
- Apply concurrency when needed
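The "measure first, then pick the right data structure" advice can be sketched with `timeit`; the sizes and repeat counts are arbitrary demo values:

```python
import timeit

items_list = list(range(10_000))
items_set = set(items_list)

# Membership tests: O(n) scan on a list vs O(1) hash lookup on a set
t_list = timeit.timeit(lambda: 9_999 in items_list, number=1_000)
t_set = timeit.timeit(lambda: 9_999 in items_set, number=1_000)

print(t_set < t_list)  # True: the right data structure beats micro-tuning
```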
29. What is async and await?
async and await are keywords used in Python to write asynchronous (non-blocking) code using the asyncio framework.

When to Use Async

- Web servers
- API calls
- Database queries
- File/network operations
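A minimal sketch with asyncio; `asyncio.sleep` stands in for a real I/O wait, and the coroutine names and delays are illustrative:

```python
import asyncio

async def fetch(name, delay):
    await asyncio.sleep(delay)   # suspends here without blocking the event loop
    return f"{name} done"

async def main():
    # Both waits overlap: total time is about 0.2 s, not 0.4 s
    return await asyncio.gather(fetch("a", 0.2), fetch("b", 0.2))

print(asyncio.run(main()))  # ['a done', 'b done']
```

`async def` defines a coroutine, `await` marks the points where it yields control, and `asyncio.run()` drives the event loop.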
30. What is event-driven programming?

Event-driven programming is a programming paradigm where the flow of execution is determined by events, such as:

- User actions (clicks, key presses)
- System events
- Network responses
- Messages or signals
Core Components of Event-Driven Programming

- Event → Something that happens
- Event Listener → Waits for an event
- Event Handler → Executes when the event occurs
- Event Loop → Continuously checks for events
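The listener/handler/dispatch components can be sketched with a tiny emitter class (the `EventEmitter` name and the "click" event are invented for the demo):

```python
class EventEmitter:
    def __init__(self):
        self.handlers = {}            # event name -> list of handlers

    def on(self, event, handler):
        # Listener registration: remember which handler cares about which event
        self.handlers.setdefault(event, []).append(handler)

    def emit(self, event, *args):
        # Dispatch: run every handler registered for this event
        for handler in self.handlers.get(event, []):
            handler(*args)

emitter = EventEmitter()
clicks = []
emitter.on("click", lambda x, y: clicks.append((x, y)))
emitter.emit("click", 10, 20)
print(clicks)  # [(10, 20)]
```

A real event loop would call `emit` from inside a loop that waits for OS, GUI, or network events, but the handler-registration pattern is the same.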
31. How does Python handle concurrency?
Python handles concurrency using:

- Multithreading
- Multiprocessing
- Asynchronous programming (asyncio)

Each is suitable for different types of tasks.
What is Concurrency?

Concurrency means managing multiple tasks in overlapping time periods; the tasks make progress together, but do not necessarily run at the same instant.
| Feature | Threading | Multiprocessing | Asyncio |
|---|---|---|---|
| GIL affected | Yes | No | No (single thread) |
| True parallelism | No | Yes | No |
| Memory usage | Low | Higher | Very low |
| Best for | I/O-bound | CPU-bound | High I/O concurrency |
32. What are coroutines?
A coroutine is a special type of function that can:

- Pause its execution
- Yield control back to the caller
- Resume execution later
When to Use Coroutines

- Web servers
- API calls
- Chat applications
- High-concurrency network systems
- Database queries
33. How does Python differ from Java/C++ internally?
Internally, Python differs from Java and C++ in terms of:

- Execution model
- Memory management
- Typing system
- Performance
- Compilation process
| Feature | Python | Java | C++ |
|---|---|---|---|
| Typing | Dynamic | Static | Static |
| Type Checking | Runtime | Compile-time | Compile-time |
| Flexible variables | Yes | No | No |
34. What is Cython?

Why Cython is Used
Python is slower for CPU-heavy tasks because:

- It is dynamically typed
- It runs on a virtual machine
- It has GIL limitations
Cython helps by:

- Converting Python code to C
- Allowing static type declarations
- Reducing Python runtime overhead
How Cython Works

Cython source code (.pyx)
↓
Converted to C Code
↓
Compiled to Machine Code
↓
Imported as Python Module
| Feature | Python | Cython |
|---|---|---|
| Typing | Dynamic | Static + Dynamic |
| Speed | Slower | Much Faster |
| Compilation | Bytecode | Compiled to C |
| Low-level access | No | Yes |
35. What are Python bytecodes?

Bytecode is the intermediate, platform-independent instruction set that CPython compiles source code into. It is executed by the Python Virtual Machine (PVM).
| Feature | Bytecode | Machine Code |
|---|---|---|
| Executed by | Virtual Machine | CPU |
| Platform dependent | No | Yes |
| Language specific | Yes | No |
| Human readable | No | No |
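Bytecode can be inspected with the standard library's dis module (the `add` function is illustrative; exact opcode names vary between CPython versions):

```python
import dis

def add(a, b):
    return a + b

# Show the bytecode instructions the PVM executes for add()
dis.dis(add)
# Output lists instructions such as LOAD_FAST and a binary-add opcode
```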
36. How does Python execute code internally?

Python source code is parsed into an AST, compiled to bytecode, and executed by the Python Virtual Machine (PVM).

Important Internal Components
| Component | Role |
|---|---|
| Parser | Converts source to AST |
| Compiler | Converts AST to bytecode |
| Bytecode | Intermediate instructions |
| PVM | Executes bytecode |
| Garbage Collector | Manages memory |
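The parse → compile → execute pipeline from the table can be driven by hand with the built-ins `compile()` and `exec()` (the source string is a made-up example):

```python
source = "result = 2 ** 10"

# Parser + compiler: source text -> code object holding bytecode
code_obj = compile(source, filename="<demo>", mode="exec")
print(code_obj.co_names)  # ('result',)

# PVM: execute that bytecode in a fresh namespace
namespace = {}
exec(code_obj, namespace)
print(namespace["result"])  # 1024
```

This is essentially what the interpreter does for every module, with the extra step of caching the compiled bytecode in `__pycache__`.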
37. What are common Python performance bottlenecks?
Common Python performance bottlenecks include:

- Inefficient algorithms
- Excessive loops in Python
- Improper data structures
- Heavy CPU-bound operations
- Blocking I/O
- GIL limitations
- Excessive object creation
- Memory inefficiency
| Bottleneck | Solution |
|---|---|
| Bad algorithm | Improve complexity |
| Wrong data structure | Use set/dict |
| CPU-heavy loops | Use multiprocessing |
| Blocking I/O | Use async/threading |
| String concatenation | Use join() |
| Memory overload | Use generators |
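The "memory overload → use generators" row can be sketched by comparing a materialized list with a lazy generator over the same computation (the sizes are illustrative, and exact byte counts vary by platform):

```python
import sys

squares_list = [n * n for n in range(100_000)]
squares_gen = (n * n for n in range(100_000))

print(sys.getsizeof(squares_list) > 100_000)  # True: the list holds every element
print(sys.getsizeof(squares_gen) < 1_000)     # True: the generator object is tiny

print(sum(squares_gen) == sum(squares_list))  # True: same result, far less memory
```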
38. How do you design scalable Python applications?
To design scalable Python applications:

- Use modular architecture
- Separate concerns (clean architecture)
- Optimize performance
- Use asynchronous or concurrent processing
- Scale horizontally
- Use caching and load balancing
- Monitor and profile continuously
Scalability Types
| Type | Meaning |
|---|---|
| Vertical Scaling | Increase server power |
| Horizontal Scaling | Add more servers |
| Functional Scaling | Split into services |
39. What is dependency injection in Python?

Dependency injection means supplying an object's dependencies from the outside (for example, as constructor arguments) instead of creating them inside the object.

This promotes:

- Loose coupling
- Better testability
- Cleaner architecture
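A minimal sketch of constructor injection; the `ConsoleSender` and `Notifier` names are invented for the demo:

```python
class ConsoleSender:
    def send(self, to, message):
        return f"[console] {to}: {message}"

class Notifier:
    def __init__(self, sender):
        self.sender = sender     # dependency injected, not created here

    def notify(self, user):
        return self.sender.send(user, "Welcome!")

# Swap in any compatible sender (e.g., a fake in tests)
# without changing Notifier itself
print(Notifier(ConsoleSender()).notify("alice"))  # [console] alice: Welcome!
```

Because `Notifier` never constructs its own sender, tests can inject a stub and production code can inject an email or SMS implementation, with no change to `Notifier`.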
40. What does production-level Python code look like?

Production-level Python code should be:

- Clean and readable
- Modular and scalable
- Well-tested
- Secure
- Performant
- Monitored and maintainable
Production Code Checklist
| Area | Must Have |
|---|---|
| Code quality | Clean & modular |
| Testing | Unit & integration |
| Logging | Structured logging |
| Error handling | Proper exception management |
| Security | Environment variables |
| Performance | Profile & optimize |
| Deployment | CI/CD pipeline |