by Tweet Hunter
Tweet Hunter analyzes a huge number of tweets and extracts the most trending topics, so you always know what to tweet about to stay hot.
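Tweet Hunter's actual extraction method is not public, but the idea can be sketched as simple token-frequency counting over a batch of tweets. This is a hypothetical minimal version (the stopword list and `trending_topics` function are illustrative assumptions, not Tweet Hunter's implementation):

```python
import re
from collections import Counter

# Assumed stopword list; a real system would use a much larger one,
# which is why raw counts alone surface words like "the" and "to".
STOPWORDS = {"the", "to", "is", "and", "of", "for", "on", "in", "a", "you"}

def trending_topics(tweets, top_n=8):
    """Return the top_n most frequent non-stopword tokens across tweets."""
    counts = Counter()
    for tweet in tweets:
        tokens = re.findall(r"[a-z']+", tweet.lower())
        counts.update(t for t in tokens if t not in STOPWORDS)
    return [word for word, _ in counts.most_common(top_n)]

tweets = [
    "kendrick dropped another diss track",
    "drake replied to kendrick again",
    "kendrick lamar is trending worldwide",
]
print(trending_topics(tweets, top_n=3))
```

With a small stopword list like the one above, common function words still leak through on real data, which matches the keyword clusters shown on this page.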
(kendrick, drake, the, to, is, and, lamar, kendrick lamar)
Best tweets on this topic
drake really sat in the studio for 24 hours just to say "if im a liar... why arent my pants on fire?"
I only want drake to keep replying so I can listen to more kendrick songs
𝕺 𝕮𝖔𝖗𝖛𝖔
@visecs
Kendrick Lamar got too much information son. J. Cole DEFINITELY dropped out the beef before he exposed his dreads are fake, he hiding 3 kids and he got a gluten allergy or some shit lmaooo
(kids, to, the, their, is, and, school, in)
Best tweets on this topic
A Kwanyama
@AlexiaSchlechte
I wanna propose a bill that keeps children off social media until the age of 18. Parents can’t post their kids either.
Àbíkẹ́
@gamvchirai
So Norway banned smartphones in schools and 3 years later, girls’ GPAs are up, their visits to mental health professionals are down 60% and bullying in both boys and girls is down 43-46%. That’s wild
African parents will use Nicolas Jackson as a moral lesson for the reason they keep telling you to barb lowcut. 😂😂
(ai, the, to, and, of, for, is, on)
Best tweets on this topic
Andrej Karpathy
@karpathy
# CUDA/C++ origins of Deep Learning

Fun fact: many people might have heard about the ImageNet / AlexNet moment of 2012, and the deep learning revolution it started. https://t.co/2xjLWODMOf

What's maybe a bit less known is that the code backing this winning submission to the contest was written from scratch, manually in CUDA/C++ by Alex Krizhevsky. The repo was called cuda-convnet and it was here on Google Code: https://t.co/ch137VSYZ4 I think Google Code was shut down (?), but I found some forks of it on GitHub now, e.g.: https://t.co/zYhzdUxoEN

This was among the first high-profile applications of CUDA for Deep Learning, and it is the scale that doing so afforded that allowed this network to get such a strong performance in the ImageNet benchmark. Actually, this was a fairly sophisticated multi-GPU application too, and e.g. included model parallelism, where the two parallel convolution streams were split across two GPUs.

You have to also appreciate that at this time in 2012 (~12 years ago), the majority of deep learning was done in Matlab, on CPU, in toy settings, iterating on all kinds of learning algorithms, architectures and optimization ideas. So it was quite novel and unexpected to see Alex, Ilya and Geoff say: forget all the algorithms work, just take a fairly standard ConvNet, make it very big, train it on a big dataset (ImageNet), and just implement the whole thing in CUDA/C++. And it's in this way that deep learning as a field got a big spark. I recall reading through cuda-convnet around that time like... what is this :S

Now of course, there were already hints of a shift in direction towards scaling, e.g. Matlab had its initial support for GPUs, and much of the work in Andrew Ng's lab at Stanford around this time (where I rotated as a 1st year PhD student) was moving in the direction of GPUs for deep learning at scale, among a number of parallel efforts.

But I just thought it was amusing, while writing all this C/C++ code and CUDA kernels, that it feels a bit like coming back around to that moment, to something that looks a bit like cuda-convnet.
Marques Brownlee
@MKBHD
On one hand: It seems like it's only a matter of time before Apple starts making major AI-related moves around the iPhone and iOS and buries these AI-in-a-box gadgets extremely quickly.

On the other hand: Have you used Siri lately?
Anthropic
@AnthropicAI
The Claude iOS app has arrived. The power of frontier intelligence is now in your back pocket. Download now on the App Store: https://apps.apple.com/us/app/claude/id6473753684 pic.twitter.com/yabpNuziQz
(trading, you, your, the, and, to, trader, on)
Best tweets on this topic
Once you finally find your model... Trading will feel like an illegal money printer. You know when to execute, and when to sit back. You know what to look for, and when to look for it. You know how much to risk, and on which accounts. Cheatcode.
Tony Trades
@ScarfaceTrades_
I made $100k in April trading. You want to know what I did?
- I took 1 setup ONLY for 30 days.
- I traded the same 1-3 tickers all month.
- I took 2 trades on average daily.
- I followed my plan for every trade.
That’s it. Stop overcomplicating it, simplicity is key.
Lark Davis
@TheCryptoLark
If you want to get rich investing, follow this mind-blowingly simple rule: Buy low and sell high. Sounds obvious, yet so many chase highs and ignore the lows. It doesn't get any simpler than this.
(followers, to, the, you, my, in, for, days)
Best tweets on this topic
impact ≠ going number 1 on spotify impact = touching the lives of millions of fans w their music enough to mobilize them for good causes, cementing their legacy as artists who inspire change in their fans, strengthening community & solidarity between fans from across the world
Kori Wilson
@mskoriwilson
3000 followers... I must be dreaming! I started building in late November. Made over $10K by March. And now I have 3K beautiful people. Believing in me. To anyone who thinks it can’t happen, it can. From no likes. To lots of likes. No followers. To amazing followers. No money. To good money. In 5 short months. Thanks to each of you. Feeling incredibly lucky today.
Content Inspiration, AI, scheduling, automation, analytics, CRM.
Get all of that and more in Tweet Hunter.
Try Tweet Hunter for free