
What is a set comprehension and dictionary comprehension?


In our earlier posts (Part 1 and Part 2), we discussed list comprehension extensively. In this post, I would like to conclude those ideas and see how they extend to other iterables in Python. A set comprehension works in a similar fashion. A set data structure in Python stores only unique elements. Let's create a list of numbers.


list_nos = [1,2,3,4,4,4,5,6,6,2,2,2,1,1,0]

We create a set from it using a set comprehension, as below.

set_nos = {element for element in list_nos}
print(set_nos)
---> {0, 1, 2, 3, 4, 5, 6}
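A set comprehension also accepts a filtering condition, just like a list comprehension. As a small illustrative addition (not from the original example), the snippet below keeps only the even numbers, with duplicates collapsing automatically:

```python
list_nos = [1, 2, 3, 4, 4, 4, 5, 6, 6, 2, 2, 2, 1, 1, 0]

# keep only the even numbers; the set removes duplicates for us
even_nos = {element for element in list_nos if element % 2 == 0}
print(sorted(even_nos))
# ---> [0, 2, 4, 6]
```

Sorting before printing just makes the output deterministic; sets themselves are unordered.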

Another example of a set comprehension is shown below. Let's create a sentence, i.e. a string, in Python.


sentence = "I am having a great day"
set_sentence = {element for element in sentence}
print(set_sentence)
---> {'t', 'e', 'h', 'I', 'm', 'v', 'y', ' ', 'n', 'g', 'a', 'd', 'i', 'r'}
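We can also transform each element before it enters the set. As a small extension of the sentence example above (not from the original post), the snippet below lowercases each character and drops the spaces, giving the unique letters of the sentence:

```python
sentence = "I am having a great day"

# lowercase every character and skip spaces before adding to the set
letters = {ch.lower() for ch in sentence if ch != " "}
print(sorted(letters))
# ---> ['a', 'd', 'e', 'g', 'h', 'i', 'm', 'n', 'r', 't', 'v', 'y']
```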

Comprehension syntax can be extended to dictionaries as well in Python. Let's create a list of strings.


list_strings = ["Hi", "Hello", "Wait", "Dry"]
dict_comprehension = {key:value for key,value in enumerate(list_strings)}
print(dict_comprehension)
---> {0: 'Hi', 1: 'Hello', 2: 'Wait', 3: 'Dry'}
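The value in a dictionary comprehension can be any expression, not just the element itself. As a small illustrative variation on the example above (not from the original post), the snippet below maps each word to its length:

```python
list_strings = ["Hi", "Hello", "Wait", "Dry"]

# map each word to the number of characters it contains
word_lengths = {word: len(word) for word in list_strings}
print(word_lengths)
# ---> {'Hi': 2, 'Hello': 5, 'Wait': 4, 'Dry': 3}
```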

In a dictionary comprehension, we enclose the expression in braces "{ }", just as in a set comprehension. The only difference is that a dictionary holds key-value pairs, while a set holds only elements.
In general, list comprehension is widely used and very popular. Set and dictionary comprehensions are less common, but we should be aware of them.
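One pitfall worth noting as a small addition to the post: since both comprehensions use the same braces, it is only the presence of a `key: value` pair that tells Python which one you are writing, and the empty literal `{}` is always a dict, never a set:

```python
# the colon decides which comprehension you get
as_set = {n * n for n in range(3)}      # set comprehension
as_dict = {n: n * n for n in range(3)}  # dictionary comprehension

print(type(as_set).__name__)   # ---> set
print(type(as_dict).__name__)  # ---> dict
print(type({}).__name__)       # ---> dict (empty braces are a dict, not a set)
```

To create an empty set, use `set()` instead.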
