🥤 Your dream of a soda pool is just a Fanta sea
Late to the Party 🎉 is about insights into real-world AI without the hype.
Hello internet,
Aah, I’m back home but already getting ready for more shenanigans! Let’s enjoy some machine learning.
The Latest Fashion
- Google is merging its AI divisions Brain and DeepMind into Google DeepMind
- The computer vision tutorials in Jupyter seem very high-quality
- The Electronic Frontier Foundation has released a statement about AI copyright
Got this from a friend? Subscribe here!
My Current Obsession
I was at the Collaborations Workshop, which was great. I met a lot of the other fellows of the Software Sustainability Institute, and it was so lovely. It was also the first genuinely hybrid event I’ve attended, and to me, it was a game-changer. It’s pretty incredible to see how well hybrid events can work!
During the Hack Day, we came up with the idea of Research Cards: a way for researchers to aggregate multiple research artefacts into a central, standardised format that can then be used for additional information and analysis. The JSON format is a first prototype of the whole vision (see the sketch below). Basically, it’s supposed to be modular and extensible, with internationalisation front of mind. Then we can add fields like a lay summary, limitations and ethical implications. Wouldn’t that be useful? I think so, at least!
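To make that a bit more concrete, here’s a minimal sketch in Python of what such a Research Card could look like. All field names and values here are hypothetical placeholders of mine, not the actual prototype schema:

```python
import json

# Hypothetical Research Card: the fields are illustrative, not a finalised schema.
research_card = {
    "schema_version": "0.1",  # extensible: bump as new fields get added
    "title": {"en": "An Example Paper Title"},  # language-keyed for internationalisation
    "artefacts": [
        {"type": "paper", "doi": "10.1234/example"},
        {"type": "code", "url": "https://example.org/repo"},
        {"type": "dataset", "url": "https://example.org/data"},
    ],
    # Possible future fields mentioned above:
    "lay_summary": {"en": "A plain-language summary of the research."},
    "limitations": {"en": "Known limitations of the approach."},
    "ethical_implications": {"en": "Relevant ethical considerations."},
}

print(json.dumps(research_card, indent=2))
```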
Thing I Like
I bring one power outlet adaptor and this brick with me, which is a huge space-saver: it has decent USB charging and multiple full-sized plugs. Often I can even help out someone who forgot their adaptor!
Machine Learning Insights
Last week I asked, “What is bagging, and can you name examples of its usage?” Here’s the gist of it:
Bagging is a popular machine learning ensemble technique that involves creating multiple models trained on different subsets of the same training data.
The models’ predictions are combined to make a final prediction, which can be more accurate and robust than any individual model’s predictions. The subsets are drawn by randomly sampling the training data with replacement, producing multiple bootstrap samples. Each subset is used to train a different model, and the models’ predictions are combined through averaging or majority voting.
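Here’s a minimal from-scratch sketch of that mechanism in Python. I’m assuming decision trees as the base model and a toy regression problem; neither is required by bagging itself:

```python
import numpy as np
from sklearn.tree import DecisionTreeRegressor

rng = np.random.default_rng(42)

# Toy data: a noisy sine curve standing in for any regression problem.
X = rng.uniform(0, 6, size=(200, 1))
y = np.sin(X[:, 0]) + rng.normal(scale=0.3, size=200)

n_models = 25
models = []
for _ in range(n_models):
    # Bootstrap sample: draw indices with replacement from the training data.
    idx = rng.integers(0, len(X), size=len(X))
    model = DecisionTreeRegressor()
    model.fit(X[idx], y[idx])
    models.append(model)

# Combine predictions by averaging (majority voting would be the classification analogue).
X_new = np.linspace(0, 6, 10).reshape(-1, 1)
y_pred = np.mean([m.predict(X_new) for m in models], axis=0)
print(y_pred)
```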
The primary advantage of bagging is its ability to reduce overfitting, which is a common problem in machine learning. Overfitting occurs when a model is too complex and fits the training data too closely, resulting in poor generalisation to unseen data. Bagging helps combat this issue by capturing different patterns and relationships within the data, leading to a more accurate and robust final prediction.
One example of using bagging is in weather prediction. Historical meteorological data can be divided into subsets and used to train multiple models. These models are then combined to create a more accurate prediction of future weather conditions. Bagging is an effective technique that can help improve the performance of machine learning models and enable better predictions in various sectors.
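In practice, you’d rarely roll your own: scikit-learn ships a ready-made BaggingRegressor. Here’s a sketch of how that might look, with synthetic data standing in for historical meteorological features (the features and numbers are my assumptions for illustration, not real weather data):

```python
from sklearn.datasets import make_regression
from sklearn.ensemble import BaggingRegressor
from sklearn.model_selection import train_test_split
from sklearn.tree import DecisionTreeRegressor

# Synthetic stand-in for meteorological data (temperature, pressure, humidity, ...).
X, y = make_regression(n_samples=1000, n_features=8, noise=10.0, random_state=0)
X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)

# Each of the 50 trees is trained on its own bootstrap sample of the training data.
bagger = BaggingRegressor(DecisionTreeRegressor(), n_estimators=50, random_state=0)
bagger.fit(X_train, y_train)

# R^2 on held-out data; the averaged ensemble is typically more stable than a single tree.
print(f"Bagged R^2:      {bagger.score(X_test, y_test):.3f}")

single = DecisionTreeRegressor(random_state=0).fit(X_train, y_train)
print(f"Single-tree R^2: {single.score(X_test, y_test):.3f}")
```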
Question of the Week
- How do you assign new customers to a segment having minimal initial data?
Post your answers on Mastodon and tag me. I’d love to see what you come up with, and then I can include them in the next issue!
Tidbits from the Web
- I loved this discussion about AI art. I don’t always agree, but it’s worth it for perspective!
- Is it a plane? Is it a bird? Is it a supermassive black hole ejected from its galaxy flung across time and space?!
- I like this idea of plastic-free packaging!
Jesper Dramsch is the creator of PythonDeadlin.es, ML.recipes, data-science-gui.de and the Latent Space Community.
I laid out my ethics, including my stance on sponsorships, in case you're interested!