NLP For WhatsApp Chats


Project Overview

Natural Language Processing (NLP) is a field of Artificial Intelligence that focuses on enabling systems to understand and process human language. In this article, I will use NLP to analyze my WhatsApp chats. For privacy reasons, I will refer to the participants as Person 1, Person 2, and so on.

Project Details

Get the WhatsApp Data for NLP

If you have never exported your WhatsApp chats before, don't worry, it's very easy. To analyze WhatsApp chats with NLP, you first need to export them from your smartphone: open any chat in WhatsApp and select the Export Chat option. The resulting text file will look like this:

["[02/07/2017, 5:47:33 pm] Person_1: Hey there! This is the first message",
 "[02/07/2017, 5:48:24 pm] Person_1: This is the second message",
 "[02/07/2017, 5:48:44 pm] Person_1: Third…",
 "[02/07/2017, 8:10:52 pm] Person_2: Hey Person_1! This is the fourth message",
 "[02/07/2017, 8:14:11 pm] Person_2: Fifth …etc"]

I will use two different approaches for the NLP of WhatsApp chats: first by focusing on the fundamentals of NLP, and second by using the datetime stamp at the start of every message.
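For the second approach, each exported line can be parsed into a timestamp, a sender, and a message. A minimal sketch, assuming the `[DD/MM/YYYY, H:MM:SS am/pm]` format shown above (the `parse_line` helper and its regex are illustrative, not the project's original code):

```python
import re
from datetime import datetime

# Matches "[02/07/2017, 5:47:33 pm] Person_1: message text"
LINE_RE = re.compile(r'^\[(\d{2}/\d{2}/\d{4}, \d{1,2}:\d{2}:\d{2} [ap]m)\] ([^:]+): (.*)$')

def parse_line(line):
    """Return (datetime, sender, text) for an exported chat line, or None."""
    m = LINE_RE.match(line)
    if m is None:
        return None  # e.g. a continuation line of a multi-line message
    when = datetime.strptime(m.group(1), '%d/%m/%Y, %I:%M:%S %p')
    return when, m.group(2), m.group(3)

ts, person, text = parse_line(
    '[02/07/2017, 5:47:33 pm] Person_1: Hey there! This is the first message')
```

Lines that do not match (such as the second line of a multi-line message) return None and can be appended to the previous message instead.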

Formatting WhatsApp Chats for NLP

To analyze the conversations, the chat first needs to be turned into structured data. The basic step is to build a dictionary with one key per person, where each value is a list of that person's sentence-tokenized messages.

from collections import defaultdict
import nltk

ppl = defaultdict(list)
for line in content:
    try:
        person = line.split(':')[2][7:]
        text = nltk.sent_tokenize(':'.join(line.split(':')[3:]))
        ppl[person].extend(text)  # if the key (person) exists, extend its list with text;
                                  # if not, defaultdict creates the key with a fresh list
    except Exception:
        print(line)  # if reading a line fails, print it so we can examine why

ppl = {'Person_1': ['This is message 1', 'Another message',
                    'Hi Person_2', ... , 'My last tokenised message in the chat'],
       'Person_2': ['Hello Person_1!', "How's it going?", 'Another message', ...]}

Classification of Dialogues

The classification of tokenized messages will be achieved by training a Naive Bayes classification model on a training set of pre-categorized conversation snippets:

The trained model can be tested on a test set or on user input. It is trained to classify any tokenized sentence into categories such as Greetings, Statements, Emotions, Questions, etc.
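As a sketch, the training step might look like the following. The tiny `training_data` set and the bag-of-words `extract_features` function are illustrative assumptions; the project's actual training set contained many more pre-categorized sentences:

```python
import nltk

# Hypothetical pre-categorized training sentences, a few per category
training_data = [
    ('Hi there!', 'Greet'), ('Hello!', 'Greet'), ('Hey, good morning', 'Greet'),
    ('How are you?', 'Question'), ('Where did you go?', 'Question'),
    ('I am so happy today', 'Emotion'), ('I miss you so much', 'Emotion'),
    ('I went to the shop', 'Statement'), ('The train was late again', 'Statement'),
]

def extract_features(sentence):
    # Bag-of-words features over lowercased, whitespace-split tokens
    return {word: True for word in sentence.lower().split()}

train_set = [(extract_features(text), label) for text, label in training_data]
classifier = nltk.NaiveBayesClassifier.train(train_set)
```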


 

classifier.classify(extract_features('Hi there!'))
'Greet'

Now let's run the model on the WhatsApp data and count the occurrences of each category of tokenized messages:
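The per-person category counts can be assembled into the DataFrame used for plotting below. A minimal sketch; the `category_counts` helper is illustrative, and the lambda in the demo is a trivial rule-based stand-in for the trained classifier:

```python
from collections import Counter
import pandas as pd

def category_counts(ppl, classify):
    """Count predicted categories per person; `classify` maps a sentence to a label."""
    return pd.DataFrame(
        {person: Counter(classify(msg) for msg in msgs) for person, msgs in ppl.items()}
    ).fillna(0).astype(int)

# Toy demo with a rule-based stand-in for the trained classifier
demo = category_counts(
    {'Person_1': ['Hi there!', 'I am sad'], 'Person_2': ['Hello!']},
    lambda s: 'Greet' if s.lower().startswith(('hi', 'hello')) else 'Emotion',
)
```

In the project, `classify` would be `lambda s: classifier.classify(extract_features(s))` and `ppl` the dictionary built earlier.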

ax = df.T.plot(kind='bar', figsize=(10, 7),
               legend=True, fontsize=16, color=['y', 'g'])
ax.set_title("Frequency of Message Categories", fontsize=18)
ax.set_xlabel("Message Category", fontsize=14)
ax.set_ylabel("Frequency", fontsize=14)
# plt.savefig('plots/cat_message')  # uncomment to save
plt.show()


Emojis in WhatsApp Chats

We all use emojis, not only on WhatsApp but on every other chat platform. Let's see which emojis appear most often in these conversations.
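One way to do this is to pull emoji characters out of each message with a Unicode-range regex and tally them with a Counter. A sketch, assuming the `ppl` dictionary from earlier; the regex covers the common emoji blocks only and is an approximation, not the project's exact method:

```python
import re
from collections import Counter

# Rough matcher over common Unicode emoji blocks (not exhaustive)
EMOJI_RE = re.compile(
    '[\U0001F300-\U0001FAFF\U00002600-\U000027BF\U0001F1E6-\U0001F1FF]'
)

def emoji_counts(messages):
    """Count every emoji occurring in a list of messages."""
    counter = Counter()
    for msg in messages:
        counter.update(EMOJI_RE.findall(msg))
    return counter

counts = emoji_counts(['Hey 😘', 'miss you 😭😭', 'ok 👍'])
print(counts.most_common(2))  # [('😭', 2), ('😘', 1)]
```

In the project this would be run once per person, e.g. `emoji_counts(ppl['Person_1'])`.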

Person_1's emojis:
 (a long run of emojis, omitted here)

Most common: [('😘', 77), ('😭', 68), ('🙄', 16), ('👍', 13), ('😍', 11), ('🎉', 10), ('🤔', 8), ('🏼', 6), ('😱', 6), ('😚', 6)]


Person_2's emojis:
 (a long run of emojis, omitted here)

Most common: [('😁', 138), ('😘', 103), ('👍', 91), ('😬', 42), ('👊', 29), ('☹', 29), ('😚', 28), ('✌', 27), ('😍', 25), ('🙂', 24)]

 

Sentiment Against Time

Plotting sentiment against time is not as easy as it looks: there are many messages, and hence many different sentiment scores, on the same day. The first step is therefore to group the messages by date and calculate the mean sentiment for each day. Let's see how we can do this:

[Plot: rolling mean of daily sentiment]
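The grouping and smoothing steps can be sketched with pandas. The DataFrame below is a hypothetical stand-in; in the project the per-message scores came from running a sentiment analyser over each tokenized message:

```python
import pandas as pd

# Hypothetical per-message sentiment scores with their dates
df = pd.DataFrame({
    'date': pd.to_datetime(['2017-07-02', '2017-07-02', '2017-07-03', '2017-07-04']),
    'sentiment': [0.8, -0.2, 0.5, 0.1],
})

# Mean sentiment per day, then a rolling mean to smooth the series for plotting
daily = df.groupby('date')['sentiment'].mean()
smoothed = daily.rolling(window=2, min_periods=1).mean()
print(daily.iloc[0])  # day one mean: (0.8 - 0.2) / 2 = 0.3
```

The `smoothed` series is what gets plotted as the rolling mean.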

Frequency of Chats

Now let's have a look at the frequency of WhatsApp chats. This is not strictly NLP but time series analysis: we can use the message timestamps to see how often we chat. First, we need to create a colour palette ordered by the total number of messages for each day.
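Counting messages per day is a one-liner with pandas once the timestamps are parsed. A sketch with hypothetical timestamps; the resulting counts are what the colour palette would be ordered by:

```python
import pandas as pd

# Hypothetical message timestamps parsed from the export
timestamps = pd.to_datetime(['2017-07-02 17:47', '2017-07-02 17:48',
                             '2017-07-02 20:10', '2017-07-03 09:15'])

# Number of messages per calendar day
per_day = pd.Series(1, index=timestamps).resample('D').sum()

# Days ranked by message count, e.g. to order a colour palette
busiest_first = per_day.sort_values(ascending=False)
```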

Project Information

Category: Python, Data Science, ML/AI/NLP
Completed: 2023

Technologies Used

  • Python
  • Machine Learning
  • NLP


Looking for AI/ML expertise?

I specialize in AI and machine learning solutions that deliver measurable business impact. Whether you need predictive modeling, digital twin frameworks, or intelligent automation, I can help transform your data into valuable insights. To discuss your project, drop me an email at sachinmoze@gmail.com or use the contact form.