Updates and recent posts about Magika.

@varbear shared a link, 1 month, 3 weeks ago

The bug that taught me more about PyTorch than years of using it

A sneaky bug in PyTorch’s MPS backend let non-contiguous tensors silently ignore in-place ops like addcmul_. That’s optimizer-breaking stuff. The culprit? The Placeholder abstraction - meant to handle temp buffers under the hood - forgot to actually write results back to the original tensor…
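
If you want to check your own setup for this class of failure, comparing an in-place op on a non-contiguous view against a contiguous copy is usually enough. The sketch below is illustrative rather than the article’s reproduction; the shapes, the transpose trick, and the choice of addcmul_ are assumptions:

    import torch

    # Run on MPS when available; the check is harmless on CPU, where it should pass.
    device = "mps" if torch.backends.mps.is_available() else "cpu"

    base = torch.randn(4, 4, device=device)
    view = base.t()                      # transposed view -> non-contiguous
    assert not view.is_contiguous()

    ref = view.contiguous()              # independent contiguous copy of the same values
    a = torch.randn(4, 4, device=device)
    b = torch.randn(4, 4, device=device)

    view.addcmul_(a, b, value=0.5)       # in-place op through the non-contiguous view
    ref.addcmul_(a, b, value=0.5)        # same op on the contiguous copy

    # If a backend silently drops the write-back, the two results diverge.
    print(torch.allclose(view, ref))     # expect True on a correct backend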

@varbear shared a link, 1 month, 3 weeks ago

uv is the best thing to happen to the Python ecosystem in a decade

uv is a new Rust-powered CLI from Astral that tosses Python versioning, virtualenvs, and dependency syncing into one blisteringly fast tool. It handles your pyproject.toml like a grown-up - auto-generates it, updates it, keeps your environments identical across machines. Need to run a tool once without t…

@kaptain shared a link, 1 month, 3 weeks ago

eBPF Beginner Skill Path

This hands-on path drops devs straight into writing, loading, and poking at basic eBPF programs with libbpf, maps, and those all-important kernel safety checks. It starts simple - with a beginner-friendly challenge - then dives deeper into the verifier and tools for runtime introspection…

@kaptain shared a link, 1 month, 3 weeks ago

How to build highly available Kubernetes applications with Amazon EKS Auto Mode

Amazon EKS Auto Mode now runs the cluster for you - handling control plane updates, add-on management, and node rotation. It sticks to Kubernetes best practices so your apps stay up through node drains, pod failures, AZ outages, and rolling upgrades. It also respects Pod Disruption Budgets, Readiness Ga…

@kaptain shared a link, 1 month, 3 weeks ago

Building a Kubernetes Platform — Think Big, Think in Planes

Thinking in planes, as introduced by the Platform Engineering reference model, helps teams describe their platform in a simple, shared language, turning a collection of tools into a platform. It forces you to think horizontally, connecting teams and technologies instead of adding more layers, creati…

@kaptain shared a link, 1 month, 3 weeks ago

Helm 4 Overview

Helm 4 ditches the old plugin model for a sharper, plugin-first architecture powered by WebAssembly. That means better isolation, more control, and deeper customization - if you’re ready to adapt! Post-renderers are now plugins. That breaks compatibility with earlier exec-based setups, so expect some rewiring…

@kaptain shared a link, 1 month, 3 weeks ago

Unlocking next-generation AI performance with Dynamic Resource Allocation on Amazon EKS and Amazon EC2 P6e-GB200

Amazon just dropped EC2 P6e-GB200 UltraServers, packing NVIDIA GB200 Grace Blackwell chips. Built for running trillion-parameter AI models on Amazon EKS without losing sleep over scaling. Under the hood: NVLink 5.0, IMEX, and EFAv4 stitch up to 72 Blackwell GPUs into one memory-coherent cluster per UltraServ…

@kaptain shared a link, 1 month, 3 weeks ago

The State of OCI Artifacts for AI/ML

OCI artifacts quietly leveled up. Over the last 18 months, they’ve gone from a niche hack to production muscle for AI/ML workloads on Kubernetes. The signs? Clear enough: KitOps and ModelPack landed in the CNCF Sandbox. Kubernetes 1.31 got native support for Image Volume Source. Docker pushed Model Runner…

@kala shared a link, 1 month, 3 weeks ago

Build AI Agents Worth Keeping: The Canvas Framework

MIT and McKinsey found a gap the size of the Grand Canyon: 80% of companies claim they’re using generative AI, but fewer than 1 in 10 use cases actually ship. Blame it on scattered data, fuzzy goals, and governance that’s still MIA. A new stack is stepping in: product → agent → data → model. It flips…

@kala shared a link, 1 month, 3 weeks ago

Detect inappropriate images in S3 with AWS Rekognition + Terraform

A serverless AWS pipeline runs image moderation on autopilot - with S3, Lambda, Rekognition, SNS, and EventBridge all wired up through Terraform. When a photo gets flagged, it’s tagged, maybe quarantined, and triggers an email alert. Daily scan? Handled…
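
For a sense of what the Lambda side of such a pipeline can look like, here is a hedged sketch of a handler reacting to an S3 upload event; the environment variable name, tag key, and confidence threshold are illustrative assumptions, not the article’s Terraform-provisioned code:

    import json
    import os
    from urllib.parse import unquote_plus

    import boto3

    rekognition = boto3.client("rekognition")
    s3 = boto3.client("s3")
    sns = boto3.client("sns")

    ALERT_TOPIC_ARN = os.environ["ALERT_TOPIC_ARN"]  # hypothetical env var set by Terraform

    def handler(event, context):
        for record in event["Records"]:
            bucket = record["s3"]["bucket"]["name"]
            key = unquote_plus(record["s3"]["object"]["key"])  # S3 event keys are URL-encoded

            # Ask Rekognition for moderation labels above a confidence threshold.
            resp = rekognition.detect_moderation_labels(
                Image={"S3Object": {"Bucket": bucket, "Name": key}},
                MinConfidence=80,
            )
            labels = resp.get("ModerationLabels", [])
            if not labels:
                continue  # nothing flagged for this object

            # Tag the object so a downstream quarantine rule can act on it.
            s3.put_object_tagging(
                Bucket=bucket,
                Key=key,
                Tagging={"TagSet": [{"Key": "moderation", "Value": "flagged"}]},
            )

            # Notify subscribers about the flagged object.
            sns.publish(
                TopicArn=ALERT_TOPIC_ARN,
                Subject="Image flagged by moderation pipeline",
                Message=json.dumps({"bucket": bucket, "key": key, "labels": labels}),
            )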

Magika is an open-source file type identification engine developed by Google that uses machine learning instead of traditional signature-based heuristics. Unlike classic tools such as the Unix file command, which rely on magic bytes and handcrafted rules, Magika analyzes file content holistically using a trained model to infer the true file type.

It is designed to be both highly accurate and extremely fast, capable of classifying files in milliseconds. Magika excels at detecting edge cases where file extensions are incorrect, intentionally spoofed, or absent altogether. This makes it particularly valuable for security scanning, malware analysis, digital forensics, and large-scale content ingestion pipelines.

Magika supports hundreds of file formats, including programming languages, configuration files, documents, archives, executables, media formats, and data files. It is available as a Python library, a CLI, and integrates cleanly into automated workflows. The project is maintained by Google and released under an open-source license, making it suitable for both enterprise and research use.
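
As a quick illustration of the library API, the sketch below classifies a byte buffer. It follows the project’s published examples, but result attribute names have shifted between releases (ct_label in earlier versions, label in newer ones), so treat the exact fields as assumptions to check against your installed version:

    from magika import Magika

    m = Magika()

    # Identify raw bytes - no filename or extension needed.
    res = m.identify_bytes(b"#!/usr/bin/env python3\nprint('hello')\n")

    # Older releases expose res.output.ct_label / res.output.score;
    # newer ones use res.output.label / res.score instead.
    print(res.output.ct_label)   # e.g. "python"
    print(res.output.score)      # model confidence for the prediction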

Magika is commonly used in scenarios such as:

- Secure file uploads and content validation (see the sketch after this list)
- Malware detection and sandboxing pipelines
- Code repository scanning
- Data lake ingestion and classification
- Digital forensics and incident response
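
To make the first scenario concrete, here is a minimal sketch of gating an upload on the detected content type instead of the client-supplied extension; the allowlist and the ct_label field are illustrative assumptions (see the note on attribute names above):

    from magika import Magika

    ALLOWED_TYPES = {"pdf", "png", "jpeg"}   # hypothetical upload policy

    m = Magika()

    def is_acceptable_upload(data: bytes) -> bool:
        # Trust the model's verdict on the bytes, not the uploaded filename.
        res = m.identify_bytes(data)
        return res.output.ct_label in ALLOWED_TYPES   # res.output.label on newer releases

With a policy like this, an executable renamed to invoice.pdf is still classified by its content and rejected.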