
Positional Arguments in Python

Positional Arguments in Python allow arguments to be passed to a function implicitly based on the order in which they appear in the function argument list. When you define a Python function, you expect some values or arguments to be passed to the function. Let's define a function to understand this better.


def print_stats(name, age, type_batsman, avg_runs):
  print(f"Name of cricketer is {name}.")
  print(f"Age of {name} is {age} years.")
  print(f"{name} is a {type_batsman}-handed batsman.")
  print(f"Career ODI Average for {name} is {avg_runs} runs.")
  

In this function, name, age, type_batsman, and avg_runs are positional parameters. The function expects arguments to be passed in this exact order when it is called. Let's call the function to see this.


print_stats("Sachin Tendulkar", 49, "right", 44.8)

Name of cricketer is Sachin Tendulkar.
Age of Sachin Tendulkar is 49 years.
Sachin Tendulkar is a right-handed batsman.
Career ODI Average for Sachin Tendulkar is 44.8 runs.

What would happen if you mess up the order when passing arguments to this function?


print_stats(49, "Sachin Tendulkar", 44.8, "Right")

Name of cricketer is 49.
Age of 49 is Sachin Tendulkar years.
49 is a 44.8-handed batsman.
Career ODI Average for 49 is Right runs.

Clearly, the results do not make sense. So, when passing arguments to a function that accepts positional arguments, we must strictly maintain the order. What if we pass fewer arguments than the function expects? Let's see.


print_stats("Sachin Tendulkar", 49, "Right")

TypeError                                 Traceback (most recent call last)
<ipython-input-12-247478ed6eaf> in <module>
----> 1 print_stats("Sachin Tendulkar", 49, "Right")

TypeError: print_stats() missing 1 required positional argument: 'avg_runs'

The function expects the avg_runs argument, but it does not receive one when the call is made, leading to a TypeError. Now let's see what happens if we supply more arguments than expected.
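One way to avoid this error (a sketch, not covered in the original example) is to give a trailing parameter a default value, which makes it optional. The describe_batsman helper below is a hypothetical, simplified variant of print_stats introduced only for illustration; it returns its text instead of printing it.

```python
# A sketch (assumption, not from the original post): avg_runs gets a
# default of None, so omitting it no longer raises a TypeError.
def describe_batsman(name, type_batsman, avg_runs=None):
    text = f"{name} is a {type_batsman}-handed batsman."
    if avg_runs is not None:
        # Only mention the average when the caller supplied one.
        text += f" Career ODI Average: {avg_runs} runs."
    return text

# Both calls succeed; the second simply carries extra detail.
print(describe_batsman("Sachin Tendulkar", "right"))
print(describe_batsman("Sachin Tendulkar", "right", 44.8))
```

Default values must come after all non-default parameters in the function definition, so they pair naturally with trailing positional arguments like avg_runs here.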


print_stats("Sachin Tendulkar", 49, "Right", 44.8, 56)

TypeError                                 Traceback (most recent call last)
<ipython-input-14-1900fa90b236> in <module>
----> 1 print_stats("Sachin Tendulkar", 49, "Right", 44.8, 56)

TypeError: print_stats() takes 4 positional arguments but 5 were given

The function expects 4 positional arguments, but we supplied 5. It does not know what to do with the fifth, so it throws a TypeError. Always double-check the arguments you pass to a function that accepts positional arguments; it can save you a lot of debugging effort.
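If a function genuinely needs to accept a variable number of positional arguments, Python's *args syntax collects any surplus values into a tuple instead of raising a TypeError. The career_summary function below is a hypothetical example introduced only to illustrate the mechanism; it is not part of the original post.

```python
# A sketch (assumption): *scores gathers however many positional
# arguments follow name into a tuple, so extra values are accepted.
def career_summary(name, *scores):
    # scores is a tuple, e.g. (44, 98, 120) for the call below
    return f"{name} scored {sum(scores)} runs in {len(scores)} innings."

# Three score arguments here, but any count would work.
print(career_summary("Sachin Tendulkar", 44, 98, 120))
```

This is how built-ins such as print() and max() accept any number of positional arguments.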

This concludes the discussion on Positional Arguments in Python. In the next post, we will take up Keyword Arguments in Python.
