x0rz / tweets_analyzer
Tweets metadata scraper & activity analyzer
License: GNU General Public License v3.0
With Python 3.10 (on Debian testing), I get the error: Error: module 'collections' has no attribute 'Iterable'
Indeed, collections.Iterable has been deprecated since Python 3.3 and was removed entirely in Python 3.10.
After some debugging, the issue is in the print_charts function.
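A minimal sketch of the fix, assuming print_charts only needs Iterable for an isinstance check (is_iterable is a hypothetical helper, not the script's actual code):

```python
import collections.abc

def is_iterable(obj):
    # collections.Iterable was removed in Python 3.10; the ABC now lives
    # only in collections.abc (available since Python 3.3).
    return isinstance(obj, collections.abc.Iterable)
```
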
@x0rz
Error message: [!] Error: 'latin-1' codec can't encode characters in position 7-29: ordinal not in range(256)
avi@sin:~/github/tweets_analyzer# python3 tweets_analyzer.py -n AviHD --utc-offset 19800 --friends
[+] Getting @AviHD account data...
[+] lang : en
[+] geo_enabled : False
[+] time_zone : New Delhi
[+] utc_offset : 19800
[!] Applying timezone offset 19800 (--utc-offset)
[+] statuses_count : 199
[+] Retrieving last 199 tweets...
97%|################################################################################################################### | 194/199 [00:06<00:00, 28.94tw/s]
[+] Downloaded 199 tweets from 2011-12-15 05:35:51 to 2017-10-09 06:11:59 (2125 days)
[+] Average number of tweets per day: 0.1
Daily activity distribution (per hour)
###############################################################################
[!] Error: 'latin-1' codec can't encode characters in position 7-29: ordinal not in range(256)
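The crash happens because tweet text containing non-Latin-1 characters (Devanagari, in this case) is printed to a stream that cannot encode them. One hedged workaround, sketched with a hypothetical safe_label helper that is not part of the script, is to replace unencodable characters before printing chart labels:

```python
def safe_label(text, encoding):
    # Re-encode the label with replacement so printing to a stream with a
    # limited encoding (e.g. latin-1) cannot raise UnicodeEncodeError.
    return text.encode(encoding, errors="replace").decode(encoding)
```

Alternatively, running the script with the environment variable PYTHONIOENCODING=utf-8 (or under a UTF-8 locale) usually avoids the problem entirely.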
My Python version is 3.5.2. Commands:
$ python tweets_analyzer.py -n @screename --friends
[+] Getting @screenname account data...
[!] Twitter error: [{'message': 'Invalid or expired token.', 'code': 89}]
Hi,
I am fairly new to this. When I run the program I continually get the following error message:
Twitter error: [{u'message' : u'Bad Authentication data.' , u 'code' : 215}]
What could be the culprit?
Thanks
--Problem Solved--
root@Linux1:~/tweets_analyzer# python tweets_analyzer.py --name Snowden
[+] Getting @snowden account data...
[+] lang : en
[+] geo_enabled : False
[+] time_zone : Eastern Time (US & Canada)
[+] utc_offset : -14400
[+] statuses_count : 2130
[+] Retrieving last 1000 tweets...
100%|####################################################################################################################################################################################################| 1000/1000 [00:08<00:00, 119.15tw/s]
[+] Downloaded 1000 tweets from 2016-09-01 18:10:12 to 2017-06-06 22:35:51 (278 days)
[+] Average number of tweets per day: 3.6
Daily activity distribution (per hour)
###############################################################################
0 00:00 (-)
0 01:00 (-)
0 02:00 (-)
0 03:00 (-)
1 04:00 (-)
0 05:00 (-)
2 06:00 (-)
[!] Error: 'ascii' codec can't encode characters in position 7-8: ordinal not in range(128)
I followed the install instructions and verified that ascii_graph is installed:
asciigraph -h
Usage: asciigraph [-l ] [-f file] [-s inc|dec]
[-c] [-t [-T ]
[-w ] [-m ] [-H] [-M cs|si]
<<<
But every time tweets_analyzer is run:
File "./tweets_analyzer.py", line 22, in
from ascii_graph import Pyasciigraph
ModuleNotFoundError: No module named 'ascii_graph'
I just switched to the new Mint OS and when I try to reinstall this, the pip install fails:
$ pip install -r requirements.txt
Collecting ascii-graph==1.5.1 (from -r requirements.txt (line 1))
Downloading https://files.pythonhosted.org/packages/ab/57/90a36a9377d72cfc09a433019182030daf9bdd64db97c5808867a6ddbc57/ascii_graph-1.5.1.tar.gz
Complete output from command python setup.py egg_info:
Traceback (most recent call last):
File "", line 1, in
ImportError: No module named setuptools
Command "python setup.py egg_info" failed with error code 1 in /tmp/pip-build-8m9h1A/ascii-graph/
Any ideas?
The tqdm error below was emitted when trying to run tweets_analyzer on OpenBSD 6.3 using Python 2.7:
Error: 'NoneType' object is not callable Exception AttributeError: "'tqdm' object has no attribute 'disable'" in <object repr() failed> ignored
P.S.: I commented out numpy in requirements.txt before running pip install, since 1) my currently installed numpy version is higher than the one specified in requirements.txt, and 2) installing the numpy version pinned in requirements.txt has previously failed and broken other apps that require it.
Traceback (most recent call last):
File "tweets_analyzer.py", line 26, in
import tweepy
ImportError: No module named tweepy
The above error appears when I try to run the command python tweets_analyzer.py -n
Some people tweet less, but like lots of stuff. Likes can be used to determine at what hours of the day accounts are active.
Hey,
I think it would be interesting to have another list with a "top 5 most liked/favourited users" to fully see relationships between accounts.
Sadly I have no idea about how to implement this 😢
It may sound stupid to try to analyze tweets from a user who has never tweeted, but it should still fail gracefully.
$ tweets_analyzer -n <SomeSilentUser>
[+] Getting @SomeSilentUser account data...
[+] lang : fr
[+] geo_enabled : False
[+] time_zone : None
[+] utc_offset : None
[!] Can't get specific timezone for this user
[+] statuses_count : 0
[+] Retrieving last 0 tweets...
0tw [00:00, ?tw/s]
[!] Error: 'int' object has no attribute 'days'
Hello, I was wondering whether it is possible to analyze multiple users in a single run.
Thanks for the great script
I moved to Linux, transferred the files I had in my Mac setup, and installed the requirements successfully.
But when I ran it, I received this error message:
Exception KeyError: KeyError(<weakref at 0x7ff9973b9e68; to 'tqdm' at 0x7ff99c7d3490>,) in <bound method tqdm.__del__ of 0%|
I regenerated the Consumer and Access key and tokens and tried again and got a similar error
Exception KeyError: KeyError(<weakref at 0x7effb10f3e10; to 'tqdm' at 0x7effb6526750>,) in <bound method tqdm.__del__ of 0%| | 0/30 [00:00<?, ?tw/s]> ignored
Any thoughts on what is going on and how to fix this?
Thanks,
After downloading user tweets, the script crashes while converting for graphs
"Daily activity distribution (per hour)
###############################################################################
[!] Error: 'ascii' codec can't encode characters in position 7-8: ordinal not in range(128)"
I tracked it down to daily and weekly distribution activity charts printing.
Both are crashing, so commenting the daily distribution activity does not work as the weekly one crashes too.
./tweets_analyzer.py -n @RealDonaldTrump -l 40000
It stops analyzing at 3500.
OSX 10.11.6
Python 2.7.12
The install builds fine until numpy. I have version 1.19.4 installed. The failure produces 10 pages of error output; I have no idea what to even provide to you, as it seems a bit crazy. Is this project still active?
Does the average number of tweets per day count include retweets? If so, would you consider adding an option to ignore retweets in calculating this average (perhaps with the --no-retweets flag)? Thanks!
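The requested option could look something like this sketch (the tweet shape and flag name are assumptions, not the script's actual data model):

```python
def average_tweets_per_day(tweets, days, include_retweets=True):
    # Average daily tweet count; each tweet is a dict with an 'is_retweet'
    # boolean (hypothetical shape, for illustration only).
    if days <= 0:
        return 0.0
    if not include_retweets:
        tweets = [t for t in tweets if not t.get("is_retweet")]
    return len(tweets) / float(days)
```
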
When I run the script with the --friends option, I get this error:
[!] Error: object of type 'NoneType' has no len()
Apparently it happens in print_stats() with the friends_lang argument, when assigning a value to max_len_key.
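A possible guard, assuming the crash comes from a None key in the friends_lang counter (some friends have no language set); max_key_length is a hypothetical helper, not the script's code:

```python
def max_key_length(counter):
    # Longest key length in a stats dict, skipping None keys so len(None)
    # is never evaluated.
    return max((len(k) for k in counter if k is not None), default=0)
```
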
Twitter changed their API to v1.1 and discontinued support for XML, Atom, and RSS; they now support JSON only.
Hi,
Can you help me? I have this issue...
During the installation I get this message; maybe it's related...
Using legacy setup.py install for ascii-graph, since package 'wheel' is not installed.
Using legacy setup.py install for numpy, since package 'wheel' is not installed.
Using legacy setup.py install for PySocks, since package 'wheel' is not installed.
ERROR: spacy 2.2.4 has requirement tqdm<5.0.0,>=4.38.0, but you'll have tqdm 4.25.0 which is incompatible.
ERROR: socialscan 1.1.3 has requirement tqdm>=4.31.0, but you'll have tqdm 4.25.0 which is incompatible.
Installing collected packages: ascii-graph, certifi, idna, numpy, oauthlib, PySocks, urllib3, requests, requests-oauthlib, six, tqdm, tweepy
Regards
This one is shit talking too ;)
I get this error, suggesting I use --limit, and the average given for daily posts looks to be roughly double what it should be.
Traceback (most recent call last):
File "tweets_analyzer.py", line 25, in
from tqdm import tqdm
ModuleNotFoundError: No module named 'tqdm'
Please advise. This happens after running pip install -r requirements.txt.
Hi! Are you planning on working to support the V2 Api?
Hi, I need help getting an API key; it's hard to create an API app.
Does this script guess location of the Twitter Target?
I have found that I get errors.
Python 3.x is required unless you deal with these two warnings:
SNIMissingWarning -- https://urllib3.readthedocs.io/en/latest/advanced-usage.html#ssl-warnings
InsecurePlatformWarning -- https://urllib3.readthedocs.io/en/latest/advanced-usage.html#ssl-warnings
This is on a fully updated Linux Mint system.
Precision-WorkStation-390 ~ # uname -a
Linux Precision-WorkStation-390 3.19.0-32-generic #37~14.04.1-Ubuntu SMP Thu Oct 22 09:41:40 UTC 2015 x86_64 x86_64 x86_64 GNU/Linux
Precision-WorkStation-390 ~ # python --version
Python 2.7.6
Can this be put on a server so that i can get a GET response?
I'm not sure where this error originated from as I have not used tweets_analyzer in the past. I will say that it returns this error every time I try to run from the CLI.
Since Twint does not require an API to work, is it possible to update this repo to use Twint?
This might sound like a tall order, but unless we are dealing with sensitive accounts this would be a good improvement.
[!] Twitter error: [{u'message': u'SSL is required', u'code': 92}]
tweets-analyzer -n TwitterDev
[+] Getting @twitterdev account data...
[!] Twitter error: [{u'message': u'SSL is required', u'code': 92}]
Hello,
Could you add an export feature? JSON, HTML, TXT, any format.
Regards
I am trying to make a request to the users/lookup route of the Twitter API using Email/Phone params. In the result I get the error
"Could not authenticate you." in the response.
With the same credentials, requests looking up by screen_name or user_id get a perfect response. Not only that, all other API requests like home_timeline, followers_ids, friends_ids, etc. work fine with the same credentials, but on that specific request I get this error. My request is:
import requests

url = "https://api.twitter.com/1.1/users/lookup.json?map=true&phone=1234567890"
payload = {}
headers = {
    'Authorization': 'OAuth realm="http%3A%2F%2Fapi.twitter.com",oauth_consumer_key="<consumer_key>",oauth_token="<outh_token>",oauth_signature_method="HMAC-SHA1",oauth_timestamp="1595318479",oauth_nonce="<nonce>",oauth_version="1.0",oauth_signature="<sig>"'
}
response = requests.request("GET", url, headers=headers, data=payload)
print(response.text.encode('utf8'))
In Response I am getting this Error:
{
"errors": [
{
"code": 32,
"message": "Could not authenticate you."
}
]
}
What are the possible reasons, and any suggestions on this?
Any feedback is appreciated.
Hi, how are you ?
I am using your script and I get this error after executing this command:
"python2 tweets_analyzer.py -n someUSERNAME".
The error is:
[!] Twitter error: [{u'message': u'Invalid or expired token.', u'code': 89}].
How can I overcome this issue?
Thanks in advance.
This is so great! How can I export the results?
Could you remove the xxxx from secrets.py?
It's a bit annoying when you have to fill in your keys :P
I'd like to see a distribution of tweets over the years. Like, did I waste too much time on Twitter while I was a student, and suddenly drop it all after I got into crack smoking, or got pregnant.
Hi all,
Hope you are all well !
Is there an easy way to screenshot the results of the analyzer into a PNG, like in the readme?
It would be great, as it could be combined with tools like botometer to publish a tweet with a summary like
or https://twitter.com/StattoBot/status/1108268411539345408
Some refs using botometer API:
Some refs to osint twitter search:
Cheers
I get an invalid token error while analyzing tweets with tweets_analyzer. How do I fix this error?
The --no-retweets flag doesn't seem to be working anymore. While it did in the past, now whenever I run tweets_analyzer, it returns the same average number of tweets per day regardless of whether or not I use the flag.
Why is this happening? I have done that in "secret.py".
Is it possible to run an analysis (including the colored ASCII graph) by loading a previously exported/saved file (-e export_file or -s json_file)?
This would make for proper data analysis without exploding the API usage.
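Offline re-analysis would only need a loader for the saved dump. A sketch, assuming one JSON object per line (the real export format may differ):

```python
import json

def load_exported_tweets(lines):
    # Parse an iterable of JSON lines (an open export file, for instance),
    # skipping blanks, and return the tweet dicts for offline analysis.
    return [json.loads(line) for line in lines if line.strip()]
```
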
The variable name sum might create some issues down the line, as it is also a built-in function name.
tweets_analyzer/tweets_analyzer.py
Line 266 in 0251238
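A minimal illustration of the hazard (the function and its argument are made up for the example; only the name sum comes from the issue):

```python
def total(counts):
    sum = 0  # BAD: rebinds the built-in name in this scope
    for c in counts:
        sum += c
    # Calling sum([...]) here would raise TypeError: 'int' object is not callable.
    return sum

def total_fixed(counts):
    return sum(counts)  # use the built-in under an unshadowed name
```
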
Hi, using your example of Snowden's account I'm getting the following.
./tweets_analyzer.py -n Snowden -l 100
[+] Getting @Snowden account data...
[+] lang : en
[+] geo_enabled : False
[+] time_zone : Eastern Time (US & Canada)
[+] utc_offset : -18000
[+] statuses_count : 1719
[+] Retrieving last 100 tweets...
0%| | 0/100 [00:00<?, ?tw/s]
[!] Error: '1'
I've configured secrets.py, and secrets.pyc exists. All of the requirements installed correctly. Any idea what I've done wrong?
Is there any option for this? Or is it not supported yet?
I'm getting an error when executing the command:
python tweets_analyzer.py -n Snowden
[+] Getting @Snowden account data...
[!] Twitter error: Failed to send request: 'Extensions' object has no attribute 'get_extension_for_class'
Python version: 2.7.9