
All about attention

Overview

Attention layers are now used in place of RNNs and even CNNs because they process sequences faster. In this blog we will see how attention layers are implemented.

Working of Attention layers

There are three inputs to attention layers:

  1. Query: Represents the "question" or "search term" for determining which parts of the input sequence are relevant.
  2. Key: Represents the "descriptor" of each input element, used to decide how relevant each input element is to the query.
  3. Values: Contains the actual information or representation of each input element that will be passed along after attention is computed.

Given a Query and a Key, we calculate their similarity; the keys most similar to the query contribute their values most strongly to the attention output.

\[ \text{Score}(Q, K) = Q \cdot K^\top \]

The above equation results in a matrix describing how much importance each query gives to each key. In the equation, Q is the query and K is the key.

The next step is scaling. For large key dimensions the dot products grow large in magnitude, which pushes the softmax into regions with extremely small gradients, so we scale the scores down. The equation now takes the following shape:

\[ \text{Scaled Score}(Q, K) = \frac{Q \cdot K^\top}{\sqrt{d_k}} \]

Where \(d_k\) is the dimensionality of the Key vectors.

The scores are passed through a softmax function to convert them into probabilities (attention weights). These probabilities determine the contribution of each input element. The equation now takes the following form:

\[ \text{Attention Weights} = \text{softmax}\left(\frac{Q \cdot K^\top}{\sqrt{d_k}}\right) \]

Overall the equation would look something like this:

\[ \text{Attention}(Q, K, V) = \text{softmax}\left(\frac{QK^\top}{\sqrt{d_k}}\right) V \]
QKV
Figure 1: Query Key Value

Scaled Dot Product Attention
Figure 2: Flow of calculating Attention in Scaled Dot Product Attention
Query Key mapping
Figure 3: Example mapping of similar query-key value pairs

Let's try to understand this with an analogy. Consider visiting a library and asking for a book. You say "I want a book about science fiction"; this is analogous to the Query. The library compares your query against the description of each book in the library (the Key) and hands you the list of books that fit the science-fiction genre (the Value).

Queries, Keys, and Values are computed as linear transformations of the input embeddings (or outputs of the previous layer):

$$ Q = XW_Q, \quad K = XW_K, \quad V = XW_V $$

where \(X\) is the input, and \(W_Q\), \(W_K\), \(W_V\) are learned weight matrices.
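To make this concrete, below is a minimal NumPy sketch of scaled dot-product attention. The matrix sizes and random inputs are illustrative assumptions, not taken from any particular model.

import numpy as np

def softmax(z, axis=-1):
    z = z - z.max(axis=axis, keepdims=True)  # subtract the max for numerical stability
    e = np.exp(z)
    return e / e.sum(axis=axis, keepdims=True)

def attention(X, W_Q, W_K, W_V):
    Q, K, V = X @ W_Q, X @ W_K, X @ W_V  # Q = XW_Q, K = XW_K, V = XW_V
    d_k = K.shape[-1]
    scores = Q @ K.T / np.sqrt(d_k)      # scaled dot-product scores
    weights = softmax(scores)            # attention weights, each row sums to 1
    return weights @ V                   # weighted sum of the values

rng = np.random.default_rng(0)
X = rng.standard_normal((5, 8))          # 5 tokens with embedding size 8
W_Q, W_K, W_V = (rng.standard_normal((8, 4)) for _ in range(3))
out = attention(X, W_Q, W_K, W_V)        # shape (5, 4)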

Summary

  1. Attention is a layer that lets a model focus on what's important.
  2. Queries, Keys, and Values are used for information retrieval inside the attention layer.

LLaVA

Overview

LLaVA (Large Language and Vision Assistant) was first introduced in the paper "Visual Instruction Tuning".

What is Visual Instruction Tuning?

Visual instruction tuning is a method used to fine-tune a large language model, enabling it to interpret and respond to instructions derived from visual inputs.

One example is to ask a machine learning model to describe an image.

LLaVA

As already established, LLaVA is a multimodal model. Although it was trained on a comparatively small dataset, it can perform image analysis and respond to questions.

Architecture

LLaVA has the following components:

  1. Language model
  2. Vision Encoder
  3. Projection

We use Llama as the language model, which is a family of autoregressive LLMs released by Meta AI.

The vision encoder is the CLIP visual encoder ViT-L/14. The encoder extracts visual features and connects them to language embeddings through a projection matrix. The projection component translates visual features into language embedding tokens, thereby bridging the gap between text and images.

Training

Two stages of training:

  1. Pre-training for Feature Alignment: LLaVA aligns visual and language features to ensure compatibility in this initial stage.
  2. Fine-tune end-to-end: The second training stage focuses on fine-tuning the entire model. At this stage the vision encoder's weights remain fixed.

LLaVA-1.5

In LLaVA-1.5 there are two significant changes:

  1. An MLP vision-language connector
  2. Training on academic task-oriented data

The linear projection layer is replaced with a two-layer MLP.

LLaVA 1.6 (LLaVA-NeXT)

In addition to LLaVA 1.5, which uses the Vicuna-1.5 (7B and 13B) LLM backbone, LLaVA 1.6 considers more LLMs, including Mistral-7B and Nous-Hermes-2-Yi-34B. These LLMs possess nice properties such as flexible commercial-use terms, strong bilingual support, and a larger language model capacity, which allows LLaVA to support a broader spectrum of users and more scenarios in the community. The LLaVA recipe works well with various LLMs and scales up smoothly with the LLM up to 34B.

Here are the performance improvements LLaVA-NeXT has over LLaVA-1.5:

  • Increased input image resolution to 4x more pixels, allowing it to grasp more visual details. It supports three aspect ratios, up to 672x672, 336x1344, and 1344x336 resolution.
  • Better visual reasoning and zero-shot OCR capability with multimodal document and chart data.
  • Improved visual instruction tuning data mixture with a higher diversity of task instructions, optimizing for responses that solicit favorable user feedback.
  • Better visual conversation for more scenarios covering different applications.
  • Better world knowledge and logical reasoning.
  • Efficient deployment and inference with SGLang.

Other variants of LLaVA:

  1. LLaVA-Med
  2. LLaVA-Interactive

References

  1. A. Acharya, “LLAVA, LLAVA-1.5, and LLAVA-NeXT(1.6) explained,” Nov. 04, 2024. https://encord.com/blog/llava-large-language-vision-assistant/
  2. Wikipedia contributors, “Llama (language model),” Wikipedia, Jan. 01, 2025. https://en.wikipedia.org/wiki/Llama_(language_model)

Introduction to Hugging Face

Overview

Hugging Face is a leading platform in natural language processing (NLP) and machine learning (ML), providing tools, libraries, and models for developers and researchers. It is widely known for its open-source libraries and community contributions, facilitating the use of pre-trained models and accelerating ML workflows.

Applications of Hugging Face:

  • Sentiment Analysis
  • Text Summarization
  • Machine Translation
  • Chatbots and Virtual Assistants
  • Image Captioning (via VLMs)
  • Healthcare, legal, and financial domain-specific NLP solutions

Why Hugging Face Matters:

Hugging Face democratizes access to advanced AI tools, fostering innovation and collaboration. With its open-source ethos, it has become a go-to resource for researchers and developers alike, empowering them to tackle complex challenges in AI and ML effectively.

Hugging Face can be used with both TensorFlow and PyTorch.

Hugging Face AutoClasses

Hugging Face AutoClasses are an abstraction that simplifies the use of pre-trained models for various tasks, such as text classification, translation, and summarization. They automatically select the appropriate architecture and configuration for a given pre-trained model from the Hugging Face Model Hub.

Commonly Used AutoClasses:

1. AutoModel
  • For loading generic pre-trained models.
  • Use case: Extracting hidden states or embeddings.
from transformers import AutoModel
model = AutoModel.from_pretrained("bert-base-uncased")
2. AutoModelForSequenceClassification
  • For text classification tasks.
  • Use case: Sentiment analysis, spam detection, etc.
from transformers import AutoModelForSequenceClassification
model = AutoModelForSequenceClassification.from_pretrained("bert-base-uncased")
3. AutoTokenizer
  • Automatically loads the appropriate tokenizer for the specified model.
  • Handles tokenization, encoding, and decoding.
from transformers import AutoTokenizer
tokenizer = AutoTokenizer.from_pretrained("bert-base-uncased")
4. AutoModelForQuestionAnswering
  • For question-answering tasks.
  • Use case: Extracting answers from context.
from transformers import AutoModelForQuestionAnswering
model = AutoModelForQuestionAnswering.from_pretrained("bert-large-uncased-whole-word-masking-finetuned-squad")
5. AutoModelForSeq2SeqLM
  • For sequence-to-sequence tasks like translation or summarization.
from transformers import AutoModelForSeq2SeqLM
model = AutoModelForSeq2SeqLM.from_pretrained("t5-small")
6. AutoModelForTokenClassification
  • For tasks like Named Entity Recognition (NER) or Part-of-Speech (POS) tagging.
from transformers import AutoModelForTokenClassification
model = AutoModelForTokenClassification.from_pretrained("dbmdz/bert-large-cased-finetuned-conll03-english")
7. AutoModelForCausalLM
  • For language modeling tasks that generate text.
from transformers import AutoModelForCausalLM
model = AutoModelForCausalLM.from_pretrained("gpt2")
8. AutoProcessor (for Multimodal Models)
  • Loads processors for tasks involving images, text, or both.
  • Example: Vision-Language Models (e.g., CLIP).
from transformers import AutoProcessor
processor = AutoProcessor.from_pretrained("openai/clip-vit-base-patch32")

Use Cases in Projects:

  • VLMs: Use AutoProcessor and AutoModel for image-text embedding or image captioning tasks.
  • Healthcare: Use AutoModelForSequenceClassification for text classification tasks like predicting medical conditions based on clinical notes.
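As a concrete sketch of how these AutoClasses fit together, here is a minimal sentiment-analysis example. The checkpoint name is just one public example; any sequence-classification checkpoint from the Model Hub would work the same way.

from transformers import AutoTokenizer, AutoModelForSequenceClassification
import torch

name = "distilbert-base-uncased-finetuned-sst-2-english"  # example public checkpoint
tokenizer = AutoTokenizer.from_pretrained(name)
model = AutoModelForSequenceClassification.from_pretrained(name)

inputs = tokenizer("Hugging Face makes NLP easy!", return_tensors="pt")
with torch.no_grad():
    logits = model(**inputs).logits
print(model.config.id2label[logits.argmax(-1).item()])  # e.g. POSITIVE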

Why use Transformers?

Traditionally, RNNs were used to process text, but as the window size increases they suffer from vanishing gradients. Additionally, they are slow because they process tokens sequentially. Transformers address both of these concerns.

Stable Diffusion Understanding

Overview

Stable Diffusion has become extremely popular for image generation and is the go-to model for developers. It is a latent diffusion model that generates AI images from text. You can also provide an image along with text to generate images.

Capabilities of Stable Diffusion

Stable Diffusion is a text-to-image model: given a text prompt, it will produce an image.

Stable Diffusion Text to Image
Figure 1: Basic Workflow of Stable Diffusion

Stable Diffusion belongs to a class of deep learning models called diffusion models. These are models that are capable of generating new data similar to the training data. They are so named because they use the diffusion-based mechanics we see in physics. We see two types of diffusion here:

  1. Forward Diffusion
  2. Reverse Diffusion

Forward Diffusion

Forward diffusion is the process that adds noise to an image in steps such that it gradually becomes unrecognizable. It is similar to dropping ink on tissue paper: the ink eventually spreads out.

Forward Diffusion Process
Figure 2: Stable diffusion Forward diffusion process taken from here
Forward Diffusion Process
Figure 3: Drop of ink from the nib of the pen spreading on the tissue paper (AI Generated from LLama 3.2)
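As a toy illustration, here is a minimal NumPy sketch of forward diffusion with a fixed per-step noise level. Real diffusion models use a noise schedule that varies over time; this is a simplified assumption, not Stable Diffusion's exact schedule.

import numpy as np

def forward_diffusion(x0, steps=10, beta=0.1, seed=0):
    # each step mixes the image with fresh Gaussian noise:
    # x_t = sqrt(1 - beta) * x_{t-1} + sqrt(beta) * noise
    rng = np.random.default_rng(seed)
    x = x0.copy()
    trajectory = [x]
    for _ in range(steps):
        x = np.sqrt(1 - beta) * x + np.sqrt(beta) * rng.standard_normal(x.shape)
        trajectory.append(x)
    return trajectory  # later entries look progressively more like pure noise

image = np.ones((8, 8))  # toy "image"
noised = forward_diffusion(image)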

Reverse Diffusion

Reverse diffusion is the opposite of forward diffusion: rather than adding noise, it gradually removes noise from an image.

Reverse Diffusion Process
Figure 4: Stable diffusion Reverse diffusion process taken from here

Training process of Stable Diffusion

Adding noise is a simple process and does not require explicit training. But how do we get the original image back from a noisy one? We need to remove the noise from the image. To put it mathematically:

Reverse Diffusion Process Equation
Figure 5: Stable diffusion Reverse diffusion High Level Equation

So what we need to do is predict the amount of noise that must be removed to recover the original, almost noiseless image. We use a noise predictor, which for Stable Diffusion is a U-Net model.

U-Net Model

It is a widely used deep learning model for image segmentation. The primary purpose of the model was to address the challenge of limited data in healthcare. This network allows you to use a smaller dataset for training while maintaining the speed and accuracy of the model.

The U-Net model consists of 2 paths:

  1. Contracting Path
  2. Expansive Path

The contracting path consists of encoders that capture the relevant information and encode it. The expansive path contains decoders that decode the encoded information and also use the information from the contracting path via skip connections to generate a segmentation map.

U-net Model
Figure 6: U-net model taken from here

U-net Model Encoder
Figure 7: U-net model Encoder Architecture

U-net Model Decoder
Figure 8: U-net model Decoder Architecture
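To show the idea of the two paths and a skip connection in code, here is a minimal PyTorch sketch: a toy one-level U-Net, far smaller than the real architecture.

import torch
import torch.nn as nn

class TinyUNet(nn.Module):
    def __init__(self):
        super().__init__()
        self.enc = nn.Sequential(nn.Conv2d(1, 16, 3, padding=1), nn.ReLU())   # contracting path
        self.down = nn.MaxPool2d(2)
        self.mid = nn.Sequential(nn.Conv2d(16, 32, 3, padding=1), nn.ReLU())  # bottleneck
        self.up = nn.ConvTranspose2d(32, 16, 2, stride=2)                     # expansive path
        self.dec = nn.Sequential(nn.Conv2d(32, 16, 3, padding=1), nn.ReLU())
        self.out = nn.Conv2d(16, 1, 1)                                        # per-pixel output map

    def forward(self, x):
        s = self.enc(x)               # encoder features, kept for the skip connection
        x = self.mid(self.down(s))
        x = self.up(x)
        x = torch.cat([x, s], dim=1)  # skip connection from the contracting path
        return self.out(self.dec(x))

seg = TinyUNet()(torch.randn(1, 1, 64, 64))  # returns a (1, 1, 64, 64) map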

Cost of running the model

Diffusion models like Google's Imagen and OpenAI's DALL-E operate in pixel space. They have used some tricks to make the models faster, but it is still not enough. Stable Diffusion, by contrast, is a latent diffusion model: instead of operating in the high-dimensional image space, it first compresses the image into the latent space. The latent space is 48 times smaller, so it reaps the benefit of crunching a lot fewer numbers. That's why it's a lot faster. The compression is done with a Variational Autoencoder (VAE).

To summarise: rather than working directly in the image space, we make use of the latent space for faster generation, using a VAE to move between the two. U-Net is still used as the noise predictor.

Variational Autoencoders

Like U-Net, these also have encoders and decoders; the noise is added to the latent vector and is later decoded to generate the images.

VAE overview
Figure 9: VAE Working

Does using latent space cause loss in information?

It might seem that by using the latent space we are losing a lot of information; however, that's not the case. Images may appear random, but they are quite regular in nature. For example, a face of any species has a mouth, ears, and a nose. This is better explained by the Manifold Hypothesis.

Reverse Diffusion in Latent Space

Here’s how latent reverse diffusion in Stable Diffusion works.

  1. A random latent space matrix is generated.
  2. The noise predictor estimates the noise of the latent matrix.
  3. The estimated noise is then subtracted from the latent matrix.
  4. Steps 2 and 3 are repeated up to specific sampling steps.
  5. The decoder of VAE converts the latent matrix to the final image.

The noise predictor here is still U-Net.
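Here is a minimal sketch of that loop. The noise_predictor and vae_decoder below are hypothetical stand-ins for the real U-Net and VAE, and real samplers such as DDPM or DDIM rescale the noise with a schedule rather than the plain subtraction shown here.

import torch

def sample(noise_predictor, vae_decoder, steps=50, shape=(1, 4, 64, 64)):
    latent = torch.randn(shape)             # 1. random latent matrix
    for t in reversed(range(steps)):        # 4. repeat for the sampling steps
        noise = noise_predictor(latent, t)  # 2. estimate the noise in the latent
        latent = latent - noise / steps     # 3. subtract the estimated noise (simplified)
    return vae_decoder(latent)              # 5. decode the latent into the final image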

So far we have seen only the image generation process, which is called the unconditioned process. In the following sections we will see how we can condition on text, i.e., given a text prompt the model should generate a matching image.

Text Conditioning

To be able to generate images from text prompts we need to perform the preprocessing steps in Figure 10. In the figure, the Tokenizer and Embedder are implemented by a Contrastive Language-Image Pretraining (CLIP) model. It should be noted that since we are dealing with a text input, convolutional layers are replaced by cross-attention layers to help establish relationships between the different words in a sentence. Attention layers are the new feature extraction layers; they are replacing RNNs and CNNs since they are faster at processing and shed the inductive biases built into the structure of those networks.
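For reference, here is roughly how the CLIP text encoder used by Stable Diffusion can be invoked through Hugging Face, assuming the openai/clip-vit-large-patch14 checkpoint. The resulting 77-token embedding matrix is what feeds the cross-attention layers.

from transformers import CLIPTokenizer, CLIPTextModel
import torch

tokenizer = CLIPTokenizer.from_pretrained("openai/clip-vit-large-patch14")
text_encoder = CLIPTextModel.from_pretrained("openai/clip-vit-large-patch14")

tokens = tokenizer("a photograph of an astronaut riding a horse",
                   padding="max_length", max_length=77, return_tensors="pt")
with torch.no_grad():
    embeddings = text_encoder(**tokens).last_hidden_state  # shape (1, 77, 768)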

There are other forms of conditioning as well.

VAE overview
Figure 10: Text Conditioning steps

Summary

To summarize how stable diffusion creates images here are the steps:

  1. Given a text prompt (or an image), a random vector is generated in the latent space; for image inputs the VAE encoder produces the starting latent.
  2. U-Net then predicts the noise that was added to this vector.
  3. Given the predicted noise, we remove it from the latent vector.
  4. Steps 2 and 3 are repeated for a certain number of sampling steps.
  5. Finally, the decoder of the VAE converts the latent image back to pixel space. This is the image you get after running Stable Diffusion.

References

  1. Andrew, “How does Stable Diffusion work?,” Stable Diffusion Art, Jun. 10, 2024. https://stable-diffusion-art.com/how-stable-diffusion-work/
  2. GeeksforGeeks, “UNET Architecture explained,” GeeksforGeeks, Jun. 08, 2023. https://www.geeksforgeeks.org/u-net-architecture-explained/
  3. O. Ronneberger, P. Fischer, and T. Brox, “U-NET: Convolutional Networks for Biomedical Image Segmentation,” arXiv.org, May 18, 2015. https://arxiv.org/abs/1505.04597
  4. Wikipedia contributors, “Manifold hypothesis,” Wikipedia, Aug. 01, 2024. https://en.wikipedia.org/wiki/Manifold_hypothesis

Computational Graphs

These are directed graphs that help map out dependencies for mathematical computations. For example, let us consider the following set of equations:

  1. Y=(a-b)*(a+b)
  2. Let d=(a-b) and e=(a+b)

Our dependency graph will look as follows:

Graph Example

The lower nodes are evaluated first, then the higher nodes.

Let us consider how this works when performing chain differentiation when it comes to neural networks.

To review chain differentiation consider the following equation:

  1. y = \(u^4\)
  2. u = 3x + 2

Performing chain-rule differentiation with respect to x, we get the following:

We first perform partial differentiation of u with respect to x

\[\frac{\partial u}{\partial x} = 3 \]

Then perform partial differentiation of y with respect to u

\[\frac{\partial y}{\partial u} = 4u^3\]

Combining the two using the chain rule:

\[ \frac{\partial y}{\partial x} = \frac{\partial y}{\partial u} \cdot \frac{\partial u}{\partial x} = 4u^3 \cdot 3 = 12u^3 \]

If x = 3.0, then

u = 3(3.0) + 2 = 11

\(\frac{\partial y}{\partial x} = 12 \times 11^3 = 15972\)

Representing the above steps in a computational graph we get the following:

Chained Computational Graph

How do we implement this? Luckily, this has already been implemented for us in TensorFlow and PyTorch.

There are 2 implementations of Computational Graphs:

  1. Static Computational Graphs - Graphs are constructed once before the execution of the model.
  2. Dynamic Computational Graphs - Graphs are constructed on the fly.

TensorFlow Computational Graph Implementation

import tensorflow as tf
x = tf.constant(3.0)
with tf.GradientTape(persistent=True) as tape:
    tape.watch(x)
    u = 3*x + 2
    y = u ** 4
g = tape.gradient(y,x)
g
<tf.Tensor: shape=(), dtype=float32, numpy=15972.0>





PyTorch Computational Graph Implementation

import torch
x = torch.tensor(3.0, requires_grad=True)
u = 3*x +2
y = u**4
y.backward()
x.grad
tensor(15972.)


Hypothesis Testing

It's a statistical method used to determine whether a hypothesis about a population is true or not. It involves collecting data, analyzing it, and making a decision based on the evidence.

Steps

Step 1: State your null and alternate hypothesis

The null hypothesis is a prediction of no relationship between the variables you are interested in. The alternate hypothesis, on the other hand, predicts a relationship between the variables.

Examples
  1. You want to test whether there is a relationship between gender and height. Based on your knowledge of human physiology, you hypothesize that men are, on average, taller than women. To test this hypothesis you restate it as:

  2. H0: Men are, on average, not taller than women.

  3. Ha: Men are, on average, taller than women.
Some guidelines when using mathematical symbols:

H0                                   | Ha
Equal (=)                            | Not equal (\(\neq\))
Greater than or equal to (\(\geq\))  | Less than (\(\lt\))
Less than or equal to (\(\leq\))     | Greater than (\(\gt\))
Examples

We want to test whether the mean GPA of students in American colleges is different from 2.0.

The null and alternative hypotheses are

H0: \(\mu\) = 2.0

Ha: \(\mu\) \(\ne\) 2.0

Step 2: Perform an appropriate statistical test

For this step we perform something known as the t-test. A t-test is any statistical hypothesis test in which the test statistic follows a t-distribution under the null hypothesis.

A t-test is most commonly applied when the test statistic would follow a normal distribution if the value of a scaling term in the test statistic were known.

The t-test can be used to determine if the means of two sets of data are significantly different from each other.

An independent samples t-test compares the means of two groups.

A paired sample t-test compares means from the same group at different times.

A one sample t-test tests the mean of a single group against a known mean.

The t-score is a ratio of the difference between two groups to the difference within the groups.

The larger the t score, the more difference there is between groups.

The smaller the t score, the more similarity there is between groups.

Every t-score has a p-value to go with it.

A p-value is the probability that the results from your sample data occurred by chance.

P-values range from 0% to 100%.

Low p-values are good. They indicate that your data did not occur by chance.

Step 3: Decide whether to reject or accept your null hypothesis

To understand this step let us solve a problem:

Suppose a sample of n students were given a diagnostic test before studying a particular module and then again after completing the module. We want to find out if, in general, teaching leads to improvements in students' knowledge/skills. We can use the results from our sample of students to draw conclusions about the impact of this module in general.

Since we are calculating the mean of the same sample at different points in time, we will be using the paired t-test.

Null hypothesis - There is no difference after completing the module.

Alternate hypothesis - There is a difference after completing the module.

Calculate the difference between the two observations for each pair, i.e. \(d_i = y_i - x_i\).

Calculate the mean difference \(\bar{d}\).

Calculate the standard deviation of the differences, \(s_d\), and use this to calculate the standard error of the mean difference, \(SE(\bar{d}) = \frac{s_d}{\sqrt{n}}\).

Calculate the t value:

t = \(\frac{\bar{d}}{SE(\bar{d})}\)

Then use a table of t values to look up the p-value for the paired t-test.
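The same computation in Python, with made-up before/after scores purely for illustration; scipy's ttest_rel reproduces the manual formula and also returns the p-value.

import numpy as np
from scipy import stats

before = np.array([18, 21, 16, 22, 19, 24, 17, 21])  # hypothetical pre-module scores
after = np.array([22, 25, 17, 24, 16, 29, 20, 23])   # hypothetical post-module scores

d = after - before
se = d.std(ddof=1) / np.sqrt(len(d))  # standard error of the mean difference
t_manual = d.mean() / se              # t value from the formula above

t_scipy, p = stats.ttest_rel(after, before)
print(t_manual, t_scipy, p)           # reject H0 if p is below your significance level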

Understanding Vision Transformers

Overview

I was taking part in the ISIC 2024 challenge when I got stuck training a ResNet50 model: it started overfitting. My score at this point was 0.142. To be at the top I had to beat a score of 0.188. While scouring the internet for any new model I came across Vision Transformers. I was honestly surprised that the transformer architecture could be applied to images. I came across this interesting paper called "An Image is Worth 16x16 Words: Transformers for Image Recognition at Scale".

About the Paper

We know that the Transformer architecture has become the norm for Natural Language Processing (NLP) tasks. In vision tasks, by contrast, attention has typically been used in conjunction with convolutional networks. However, this paper demonstrates that convolutional networks need not be used: a pure transformer architecture applied to a sequence of image patches can perform image classification tasks really well, provided that it is pre-trained on large amounts of data.

Basic Theory of Vision Transformers

The Vision Transformer architecture has been inspired by the success of the Transformer in NLP. The first step to create a Vision Transformer is to split an image into patches. We then generate the positions of these patches and generate embeddings for them. Let us consider the dimensional transformation that is taking place here.

Our original image X has the dimensions HxWxC, where H is the height and W is the width of the image and C is the number of channels. Since we are dealing with RGB images, C will be 3.

After fetching the patches, we get the following dimensions NxPxPxC.

Where N is the number of patches in an image.

To calculate it N = \(\frac{H * W}{P*P}\)

Now, we flatten the aforementioned patches and project them via a dense layer to dimension D, which is known as the constant latent vector size. Then we add positional embeddings to the patch embeddings to retain some of the position information. The positional information is 1D and not 2D, since no performance gain was observed with 2D embeddings.
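A minimal PyTorch sketch of this patch-and-project step, using ViT-Base-style sizes (224x224 input, 16x16 patches, D = 768) as illustrative assumptions:

import torch

H = W = 224; C = 3; P = 16; D = 768
N = (H * W) // (P * P)                       # number of patches = 196

x = torch.randn(1, C, H, W)                  # one RGB image
patches = x.unfold(2, P, P).unfold(3, P, P)  # carve the image into P x P patches
patches = patches.permute(0, 2, 3, 1, 4, 5).reshape(1, N, P * P * C)  # flatten each patch

proj = torch.nn.Linear(P * P * C, D)            # dense projection to latent size D
pos = torch.nn.Parameter(torch.zeros(1, N, D))  # learned 1D position embeddings
tokens = proj(patches) + pos                    # (1, 196, 768), the encoder's input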

This output is forwarded through several transformer encoder blocks.

The transformer encoder block is composed of alternating layers of multi-headed self-attention and MLP blocks. Layer Norm is applied before every block, i.e., before each attention or MLP block, and a residual connection is added after every block.

It is to be noted that Vision Transformers have much less inductive bias than CNNs. Inductive biases are assumptions we make about a dataset; for example, we can assume the marks of students in a given subject to follow a Gaussian distribution. CNN architectures inherently have some biases due to the way they are structured: CNNs are built to capture the local relationships between the pixels of an image, and as CNNs get deeper, the local feature extractors help to extract global features. In ViT only the MLP layers are local and translationally equivariant, while the self-attention layers are global. A hybrid version of ViT also exists where a CNN is applied to extract the feature maps, which are then forwarded to the Transformer encoder block.

Is it better than CNNs?

In the paper, ViT only performs classification tasks, not segmentation or detection tasks, but it still matches or outperforms CNNs and introduces parallelism through multi-head self-attention. ViT only performs better with large-scale pre-training and requires more epochs to train.

Gaussian Distribution Problems in Interviews

Overview

When going for any Machine Learning or Data Science interview, the interviewers like to check if a candidate can model a problem with a distribution, everyone's favourite being the Gaussian distribution. I'm sure everyone is familiar with how to do this, but to refresh everyone's memory on the subject let's look at a question.

Interview Question

On any given day, the average number of customers visiting a store is 500 and the standard deviation is 20. What is the probability that the number of customers on any day is in the range 480-520?

Solution

So let's note down the details of the problem.

We know that the mean \(\mu\) is 500 and the standard deviation \(\sigma\) is 20. Assuming a Gaussian distribution, our model will look something like this:

Gaussian Distribution

The range 480-520 lies within one standard deviation of the mean (\(\mu \pm \sigma\)), so the total probability is 68.26%.
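We can sanity-check this with a quick scipy sketch:

from scipy import stats

mu, sigma = 500, 20
p = stats.norm.cdf(520, mu, sigma) - stats.norm.cdf(480, mu, sigma)
print(p)  # ~0.6827, i.e. 68.26%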

How to delete last 5 commits in git?

Overview

In one of my interviews I was asked if I knew git and was given a problem to solve with it: starting from a given commit, how do you delete the last 5 commits? One thing to note is that in git there are 100 ways to do the same thing, so there is no right or wrong way. However, at the end of the day the interviewer decides what is right.

Solution

Run the following:

git reset --hard HEAD~5
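Note that git reset --hard permanently discards those commits along with any uncommitted changes, and if the commits were already pushed you would also need a force push. On shared history, git revert is the safer alternative, since it undoes the changes with new commits instead of rewriting history.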

References

  1. Git - git-reset Documentation. (n.d.). https://git-scm.com/docs/git-reset

Plotting Several graphs with a single line of code using matplotlib

Overview

In one of my interviews I was asked to show how to plot 2 graphs using one line of code. Unfortunately, I wasn't aware that this was possible, so I told the interviewer that I knew how to do it in two lines; the interviewer just smiled. But honestly, who remembers such things? Given the documentation, I'm sure it would have been easy to figure out. Sadly, today's interviewing strategies don't really reflect real-world skills and abilities.

Code Solution

Multiline Solution

As for the multiline version, I am sure everyone has seen it, and to me it is the cleanest solution of all.

import numpy as np
import matplotlib.pyplot as plt
# mean = 25
# std = 10
input_vals =  np.random.normal(loc=25,scale=10, size=1000)
mean = input_vals.mean()
std = input_vals.std()
transformed_vals = input_vals-mean
transformed_vals = transformed_vals/std
new_mean = 50
new_std = 20
new_input_vals = transformed_vals * new_std
new_input_vals = new_input_vals + new_mean
new_input_vals
array([ 4.97262510e+01,  6.41950072e+01,  3.31727074e+01, ...,
        5.01540564e+01,  3.92652109e+01,  6.05587289e+01])
new_input_vals.mean()
50.0
new_input_vals.std()
20.0
plt.plot(np.arange(0,1000),input_vals,"r--")
plt.plot(np.arange(0,1000),new_input_vals,"g--")
[<matplotlib.lines.Line2D at 0x7f9e64f3c650>]

Plot: input_vals (red, dashed) and new_input_vals (green, dashed) against index




One line Solution

import numpy as np
import matplotlib.pyplot as plt
# mean = 25
# std = 10
input_vals =  np.random.normal(loc=25,scale=10, size=1000)
mean = input_vals.mean()
std = input_vals.std()
transformed_vals = input_vals-mean
transformed_vals = transformed_vals/std
new_mean = 50
new_std = 20
new_input_vals = transformed_vals * new_std
new_input_vals = new_input_vals + new_mean
new_input_vals
array([ 4.97262510e+01,  6.41950072e+01,  3.31727074e+01, ...,
        5.84483853e+01,  7.66382827e+01, -4.85587934e+00,  6.51916200e+01,
        2.88079069e+01,  4.80334253e+01,  1.70666462e+01,  6.43579081e+01,
        8.50358951e+01,  2.75791679e+01,  6.53467623e+01,  1.35252106e+01,
        6.04285460e+01,  3.52335648e+01,  5.66689933e+01,  3.70492809e+01,
        5.12373052e+01,  3.60483097e+01,  4.49377604e+01,  3.62568237e+01,
        7.21569769e+01,  5.45280619e+01,  4.62863317e+01,  2.15721111e+01,
        6.28020262e+01,  7.75276147e+01,  2.58502699e+00,  5.00169680e+01,
        4.75507661e+01,  4.55487987e+01,  3.84815627e+01,  3.28662067e+01,
        5.76735884e+01,  6.23575473e+01,  3.68540728e+01,  4.25834012e+01,
        5.19736971e+01,  3.90329786e+01,  3.72490570e+01,  3.58506327e+01,
        4.19682204e+01,  7.44048682e+01,  4.73775205e+00,  8.33666390e+01,
        5.90984994e+01,  9.94508265e+01,  6.92984708e+01,  1.27482587e+01,
        5.06972841e+01,  3.59086858e+01,  2.13352086e+01,  4.59703349e+01,
        6.02407192e+01,  4.90976392e+01,  2.53531858e+01,  5.54674832e+01,
        4.20822661e+01,  3.49161297e+01,  4.16385198e+01,  3.72166837e+01,
        4.05985426e+01,  1.07587588e+02,  6.88183705e+01,  4.80585968e+01,
        6.50940544e+01,  8.79890694e+01,  1.72163266e+01,  6.27449264e+01,
        7.89349884e+01,  5.36553511e+01,  4.15373357e+01,  1.95405270e+01,
        6.11897996e+01,  5.48447329e+01,  3.96444936e+01,  3.45375731e+01,
        7.39040149e+01,  6.03592172e+01,  2.98847636e+01,  7.33700978e+01,
        4.47181400e+01,  3.04585534e+01,  9.59927625e+00,  8.29373699e+01,
        5.52836836e+01,  9.81207909e+00,  6.86285567e+01,  4.77446949e+01,
        5.76675372e+01,  5.37974812e+01,  5.55379880e+01,  6.02699173e+01,
        6.67370555e+01,  7.54287953e+01,  4.59787102e+01,  4.60474407e+01,
        6.38744624e+01,  4.39079068e+01,  2.05678197e+01,  6.27469865e+01,
        5.79685123e+01,  5.58725664e+01,  4.12867606e+01,  5.60118119e+01,
        3.35720315e+01,  4.50343744e+01,  8.03815346e+01,  1.42108638e+01,
        8.04637036e+01,  6.59676991e+01,  5.45857009e+01,  5.74991413e+01,
        6.82651475e+01,  1.40455660e+01,  4.38444823e+01,  4.81339086e+01,
        3.33897656e+01,  5.91857305e+01,  7.82359957e+01,  5.21267627e+01,
        1.57856710e+01,  4.62251109e+01,  5.87693791e+01,  3.38576994e+01,
        4.21268689e+01,  3.83494154e+01,  3.81119768e+01,  5.45086177e+01,
        5.30471482e+01,  2.03662756e+01,  4.90384811e+01,  5.85900308e+01,
        5.07983431e+01,  4.82232284e+01,  3.57633314e+01,  3.88925452e+01,
        5.19604565e+01,  1.90800527e+01,  6.13642399e+01,  2.99467151e+01,
        4.47324900e+01,  2.45477723e+01,  7.12581742e+01,  6.77801891e+01,
        3.58031105e+01,  4.54614741e+01,  5.70220249e+01,  3.90174595e+01,
        6.02058825e+01,  5.20856895e+01,  4.75049498e+01,  3.86241729e+01,
        6.13581051e+01,  4.19909369e+01,  6.76175905e+01,  7.42983906e+01,
        7.92241492e+01,  5.33594593e+01,  4.63789503e+01,  3.35663333e+01,
        4.01872677e+01,  8.36656134e+01,  8.16212747e+01,  6.93496119e+01,
        4.61663697e+01,  3.73110654e+01,  5.49127377e+01,  4.40223836e+01,
        4.43108572e+01,  6.75982282e+01,  3.48185881e+01,  5.77048872e+01,
        3.43615266e+01,  5.96583647e+01,  3.83303854e+01,  6.44889437e+01,
        8.88984256e+01,  6.05667673e+01,  2.46517283e+01,  5.35936489e+01,
        6.83251874e+01,  7.50465482e+01,  5.35890343e+01,  5.71776979e+01,
        4.71590330e+01,  4.29138167e+01,  3.18721166e+01,  1.78723665e+01,
        2.37579224e+01,  4.70345771e+01,  4.98658045e+01,  7.95162899e+01,
        3.01964307e+01,  3.61605063e+01,  6.48481042e+01,  5.82651006e+01,
        5.67299061e+01,  5.82468645e+01,  5.27659064e+01,  7.80144548e+01,
       -1.81875982e+00,  4.48166231e+01,  2.44844292e+01,  5.00175138e+01,
        4.23655802e+01,  3.42827337e+01,  6.79546054e+01,  9.19132017e+01,
        4.01333503e+01,  5.97016053e+01,  6.53891457e+01,  6.49021729e+01,
        5.10671254e+01,  6.55272499e+01,  5.11000031e+01,  9.88725349e+01,
        3.58597844e+01,  7.74396710e+01,  4.92101330e+01,  5.11284270e+01,
        5.05615107e+01,  8.56510102e+01,  5.44176611e+01,  5.28692453e+01,
        4.48416859e+01,  6.96655859e+01,  6.66747245e+01,  2.58324302e+01,
        4.58193524e+01,  2.12910503e+01,  3.34128789e+01,  7.27412851e+01,
        9.30526713e+01,  7.86137249e+01,  6.24940419e+01,  8.65207000e+01,
        5.41363809e+01,  4.43530597e+01,  4.82412920e+01,  6.18822743e+01,
        1.48342484e+01,  4.09532013e+01,  7.47863553e+01,  6.88263728e+01,
        4.26618101e+01,  4.73684324e+01,  2.55504883e+01,  3.68092107e+01,
        3.05964151e+01,  2.92729438e+01,  7.68143135e+01,  2.00636758e+01,
        3.93172214e+01,  7.12013201e+01,  5.31840226e+01,  4.33205088e+01,
        1.85507408e+01,  6.93950937e+01,  4.08774379e+01,  7.53997880e+01,
        3.33300308e+01,  6.12127659e+01,  7.47711766e+01,  6.59695680e+01,
        6.48252715e+01,  4.51280419e+01,  7.34813321e+01,  9.91452566e+01,
        4.06925312e+01,  4.82945436e+01,  3.85368371e+01,  7.94166760e+01,
        7.61656778e+01,  3.12710381e+01,  6.72961529e+01,  4.13614225e+01,
        2.25621381e+01,  1.98036372e+01,  3.44802122e+01,  6.26084073e+01,
        3.91710105e+01,  3.10526458e+01,  5.10588932e+01,  4.90629575e+01,
        1.57885874e+01,  4.86224457e+01,  3.62146642e+01,  6.32971028e+01,
        5.06452397e+01,  3.06098599e+01,  4.46450667e+01,  1.27694720e+01,
        7.13258569e+01,  3.22340451e+01,  7.69632817e+01,  3.33908566e+01,
        5.80107954e+01,  5.58820102e+01,  3.06086354e+01,  6.17342669e+01,
        5.82069831e+01,  5.44247348e+01,  4.11739065e+01,  4.91914253e+01,
        1.03889391e+02,  4.46818900e+01,  6.39503129e+01,  3.87421151e+01,
        5.37844463e+01,  1.79796161e+01,  6.94401545e+01,  3.99892484e+01,
        7.85746916e+01,  5.62061154e+01,  2.71297470e+01,  2.07495010e+01,
        9.18238208e+01,  4.30828868e+00,  5.89973284e+01,  3.69521887e+01,
        7.77100538e+01,  2.91855571e+01,  4.39590658e+01,  6.59981463e+01,
        1.54102818e+01,  6.17457410e+01,  5.76508387e+01,  6.77781768e+01,
        6.49134137e+01,  3.04790867e+01,  1.97005788e+01,  4.47335344e+01,
        5.73058793e+01,  2.59902769e+01,  6.29930422e+01,  4.40924417e+01,
        4.68671957e+00,  5.33445891e+01,  5.86015222e+01,  2.32235267e+01,
        4.54584938e+01,  1.31758595e+01,  5.65895862e+01,  6.69366623e+01,
        2.76825617e+01,  7.99193476e+00,  4.30404663e+01,  4.23856129e+01,
        6.06713182e+01,  8.35082535e+01,  4.94483630e+01,  3.58838838e+01,
        9.04293255e+01,  5.49940920e+01,  7.93762465e+01,  3.51502094e+01,
        5.74676491e+01,  6.87092366e+01,  3.59306135e+01,  4.86862467e+01,
        6.06140023e+01,  3.95157652e+01,  2.93690746e+01,  2.14781414e+00,
        5.97178956e+01,  6.02090846e+01,  3.97506829e+01,  7.82891107e+01,
        4.25032054e+01,  4.00644889e+01,  3.19230837e+01,  3.78505725e+01,
        7.06475423e+01,  4.31800544e+01,  4.82657650e+01,  8.05206693e+01,
        4.31868963e+01,  4.85046034e+01,  4.21593980e+01,  1.51990681e+01,
        3.53738859e+01,  8.01984118e+01,  7.64656001e+01,  5.45685869e+01,
        5.13807080e+01,  8.29278438e+01,  5.83801566e+01,  5.74714255e+01,
        5.72713761e+01,  5.07381142e+01,  2.18652001e+01,  3.33351646e+01,
        2.02119171e+01,  7.34906719e+01,  7.98721720e+01,  7.94965018e+01,
        1.00811358e+02,  3.71707949e+01,  3.35557355e+01,  3.38923918e+01,
        6.85451700e+01,  9.81232875e+01,  4.84287819e+01,  3.99635350e+01,
        6.26134916e+01,  4.12477016e+01,  7.22482149e+01,  3.51448266e+01,
        6.11000646e+01,  4.51660178e+01,  2.39193305e+01,  3.28749845e+01,
        2.12626130e+01,  8.69176784e+01,  4.13164819e+01,  3.16309606e+01,
        8.75437651e+01,  3.26831039e+01,  6.16947213e+01,  6.29633742e+01,
        3.04670647e+01,  3.60134661e+01,  4.32430529e+01,  3.09289105e+01,
        5.93031665e+01,  2.52868794e+01,  2.20828915e+01,  7.68208071e+01,
        9.80676336e+01,  3.83871338e+01,  3.68649160e+01,  1.76748364e+01,
        1.35316383e+01,  5.03074303e+01,  3.39946035e+01,  5.43990016e+01,
        2.09051370e+01,  6.23807961e+01,  3.91031121e+01,  5.57380667e+01,
        4.49007304e+01,  4.28668359e+01,  5.74135957e+01,  7.79245845e+01,
        3.05567507e+01,  5.01540564e+01,  3.92652109e+01,  6.05587289e+01])
new_input_vals.mean()
50.0
new_input_vals.std()
20.0
plt.plot(np.arange(0, 1000), input_vals, "r--", np.arange(0, 1000), new_input_vals, "g--")

Figure: Original input_vals (red, dashed) plotted against the rescaled new_input_vals (green, dashed)
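For readers who want to reproduce the checks above, here is a minimal sketch. It assumes `input_vals` is a 1-D NumPy array of roughly 1000 raw samples (the stand-in data below is hypothetical); standardizing and then rescaling to mean 50 and standard deviation 20 matches the printed outputs, though the exact transformation used earlier in this notebook may differ.

```python
import numpy as np
import matplotlib.pyplot as plt

# Hypothetical stand-in for input_vals: ~1000 raw samples.
rng = np.random.default_rng(0)
input_vals = rng.normal(loc=45.0, scale=25.0, size=1000)

# Standardize to zero mean / unit std, then rescale to the
# target mean (50) and std (20) seen in the outputs above.
standardized = (input_vals - input_vals.mean()) / input_vals.std()
new_input_vals = standardized * 20.0 + 50.0

print(new_input_vals.mean())  # ~50.0
print(new_input_vals.std())   # ~20.0

# Compare the raw and rescaled series, as in the plot above.
plt.plot(np.arange(0, 1000), input_vals, "r--",
         np.arange(0, 1000), new_input_vals, "g--")
plt.show()
```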