How did I visualize percentage spending per category without importing Python plotting libraries like matplotlib?
A few days back, I took on a challenge from freeCodeCamp to create a budget app. The challenge was to write a Python class that prints budget/account details and a function that visualizes the percentage spent per category of the budget. How did I visualize this without importing Python plotting libraries like matplotlib?
Since this is a challenge for gaining certification, I'm not going to present my code the way I do in my web scraping articles. I'll just explain how I figured out the solution.
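To give a flavor of the general idea without spoiling the challenge, here is a minimal sketch (this is not the freeCodeCamp solution; the function name, chart layout, and category names are made up) of how percentages can be "plotted" with nothing but string formatting:

```python
# Generic text-only bar chart: one row per 10% level; an 'o' marks each
# category whose spending percentage reaches that level.
def spend_chart(percentages):
    """percentages: dict mapping category name -> percent spent (0-100)."""
    lines = ["Percentage spent by category"]
    for level in range(100, -1, -10):
        row = f"{level:>3}|"
        for pct in percentages.values():
            # mark the bar if this category reaches the current level
            row += " o " if pct >= level else "   "
        lines.append(row)
    # horizontal axis under the bars
    lines.append("    " + "-" * (3 * len(percentages) + 1))
    return "\n".join(lines)

print(spend_chart({"Food": 70, "Clothing": 20, "Auto": 10}))
```

Each column is one category, so a taller stack of `o`s means a larger share of the budget.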
There were some beautiful events in this scary year, and some undesirable ones too.
I have been a teacher for about three years. It was the first class I ever taught, and I was planning a farewell for them. Being the pioneer class of my career, this class was my 1st love. There was another, my 2nd love. Both these classes were so important to me that I couldn't imagine them going away. But yeah, it had to be.
My 1st class left me and I gave them two farewells (I never wanted to miss them, 'cause remembering and not finding hurts)…
In this approach, I have not used the BeautifulSoup and Pandas libraries. This script scrapes the menus from the stores at any location of choice and saves the data as a JSON file. The JSON file contains the main header "Menu", sub-headers with the names of the food stores, and inner nodes for the food categories in each store. Each category then contains the product name as the key and its price as the value…
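Concretely, the nesting described above could be written out like this (the store, category, and product names here are invented placeholders, not real scraped data):

```python
import json

# Illustrative structure only: "Menu" -> store -> category -> {product: price}
menus = {
    "Menu": {
        "Example Pizza Store": {
            "Pizzas": {"Margherita Pizza": "$9.99"},
            "Drinks": {"Cola": "$1.99"},
        },
        "Example Sushi Store": {
            "Rolls": {"California Roll": "$7.50"},
        },
    }
}

# save the scraped data as a JSON file
with open("menus.json", "w") as f:
    json.dump(menus, f, indent=2)
```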
In Part 1, I scraped doordash.com for restaurants' menus. Here is my second, different approach.
Summary of the 1st Approach
The scraper navigated to the URL (doordash.com), clicked on a food item (this opened a page of food stores in New York), clicked on each store, scraped the menus, returned to the stores' page, and repeated this loop until it had scraped 10,000–10,050 menus, clicking the next-page button when needed.
This time I wanted to be able to enter a manual search string for a location alongside the script name in the terminal. And, for that, I…
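The teaser cuts off before showing how, so here is only a minimal sketch of one common way to read a search string from the terminal via `sys.argv` (the author's actual mechanism isn't shown; the function name and default value are mine):

```python
import sys

# Hypothetical helper: pull a search string, e.g. a city name, from the
# arguments that follow the script name on the command line.
def get_search_location(argv, default="New York"):
    # argv[0] is the script name; join everything after it so
    # multi-word locations work even without quotes
    return " ".join(argv[1:]) if len(argv) > 1 else default

if __name__ == "__main__":
    # e.g.  python scraper.py San Francisco
    print(get_search_location(sys.argv))
```

The returned string can then be fed into the site's search box by the browser-automation code.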
Since I started automating browsers for web scraping, scrolling the page was a challenge for me. I searched the internet and found many solutions. One such solution was
window.scrollTo(x, y)
where 'x' is the position along the x-axis and 'y' is the position along the y-axis.
But, how would I know these coordinates?
To solve the problem, I found another answer:
window.scrollTo(0, document.body.scrollHeight)
where 'x' = 0, and 'y' = the full height of the document.
I decided to assume a range for the page height and run the scrolling code inside a loop, i.e.
for i in range(0, 7000, 200):
    browser.execute_script(f"window.scrollTo(0, {i})")
I have been a teacher for about 4 years. Just two months ago, I had to leave the field due to circumstances beyond my control. All I earned was love, compassion, and honor.
This story is about that beautiful experience.
When I started as a teacher, I was a noob among all the teachers there. All the experienced teachers advised me to show a sternness that I couldn't. I was told that students are not true to anyone. And I used to think: why should they be? …
This is a part of a series about Dynamic Web Scraping.
This story contains an introduction to dynamic websites and my first approach towards scraping them. Let's begin with the introduction to dynamic websites.
Dynamic websites produce results based on a user's actions. For example, when a webpage loads completely only once you scroll down or move the mouse over the screen, there is some dynamic programming behind it. When you hover the mouse pointer over some text and it gives you some options, that also involves some dynamics. One…
A freelance web scraper, data science enthusiast, and independent bioinformatics researcher