March Madness: Quantum Leaps, AI Bans, and the Eternal Struggle Against Laziness (It’s a Season, Apparently)

Ah, March, my birth month. The month that’s basically a seasonal identity crisis. In the Northern Hemisphere, it’s spring! Birds are chirping, flowers are contemplating blooming. Down south? It’s autumn, leaves are falling, and pumpkin spice lattes are back on the menu. Way back in the day, the Romans were like, ‘Hey, let’s start the year now!’ Because why not? Time is a construct.

Speaking of constructs, let’s talk quantum computing, which is basically time travel for nerds. China just dropped the Zuchongzhi 3.0, a quantum chip that’s apparently a quadrillion times faster than the fastest classical supercomputer (on one very specific benchmark task, mind you). Yes, quadrillion. I had to Google that too. It’s basically like if your toaster could solve the meaning of life in the time it takes to burn your toast.

This chip is so fast, it made Google’s Sycamore (last month’s big deal) look like a dial-up modem. They ran a random circuit sampling benchmark (the quantum world’s favourite party trick), beat Google’s previous record, and everyone’s like, ‘Whoa, China’s winning the quantum race!’ Which, by the way, is a marathon, not a sprint. More like a marathon where everyone’s wearing jetpacks and occasionally tripping over their own shoelaces.

Now, while China’s busy building quantum toasters, the US is busy banning Chinese AI. DeepSeek, an AI startup, got the boot from all government devices. Apparently, they’re worried about data leaking to the Chinese Communist Party. Which, fair enough. Though I’m not sure what the difference is between being leaked and being outright stolen, which is what the Yanks do.

DeepSeek’s AI models are apparently so good, they’re scaring everyone, including investors, who are now having panic attacks about Nvidia’s stock. Even Taiwan’s like, ‘Nope, not today, DeepSeek!’ And South Korea and Italy are hitting the pause button. It’s like a global AI cold war, but with more awkward silences and fewer nukes (hopefully).

And here’s the kicker: even the Chinese are worried! DeepSeek’s employees had to hand over their passports to prevent trade secrets from leaking. Maybe Chinese passports have an email function? It’s like a spy thriller, but with more lines of code and fewer martinis.

So, what’s the moral of this story? March is a wild month. Quantum computers are basically magic. AI is scaring everyone. And apparently, data privacy is like a hot potato, and everyone’s trying not to get burned. Also, don’t forget that time is a construct.

Oh, and if you’re feeling lazy, just remember, even quantum computers have to work hard. So get off your couch and do something productive. Or, you know, just watch cat videos. Whatever floats your boat.

The UK Workplace: Agile Illusion and the Rise of AI-Powered Efficiency

Speaking honestly, the world of work isn’t what it used to be. Remember when stability and routine were the golden tickets, when just turning up practically constituted doing the job? Those days are fading fast. Today, we’re navigating a landscape of constant change – technological advancements, shifting market trends, and, yes, even global pandemics. It’s a whirlwind, and the only way to stay afloat is to embrace adaptability.

We’ve seen the rise of remote work, the acceleration of digital transformation, and the increasing demand for skills that didn’t even exist two years ago. Meanwhile, an overpriced degree still takes four years to earn. If you’re still clinging to outdated methods or resisting change, you’re likely to get left behind.

So let’s cut through the fluff: the UK workplace is stuck in a rut. Everyone’s talking about ‘adaptability,’ but in reality, there’s a gaping chasm between the buzzwords and actual practice. Agile? More like ‘fragile.’ We’re drowning in terminology, but the fundamental culture of British business remains stubbornly resistant to real change.

Laziness? Yes, I said it. A culture of complacency permeates far too many organisations. My recent contract was a prime example: an army of cooks, from both the consultancy and client sides, all stirring a pot that barely needed a simmer. Three React Native developers for a simple app? Four .NET developers to copy and paste a BFF (backend-for-frontend)? A completely separate infrastructure team for a very basic integration? It was a circus of inefficiency.

While these legions of underutilised developers were busy pretending to be productive, I was building a working app using Windsurf by Codeium. And right now, Gemini is helping me create a serverless backend in Firebase. The contrast is stark, and it’s infuriating.

Here’s the truth: we’ve reached a tipping point. With the rapid advancement of AI, the traditional roles of developers are becoming increasingly redundant. I firmly believe that a skilled Business Analyst and Project Manager, armed with AI tools, are now all you need for a product build.

Imagine this: detailed requirements gathered through stakeholder interviews, translated into a prototype using AI. Employee workshops to refine the design. A final stakeholder sign-off. Then, a focus group of customers or end-users for a final review. A focused development phase, rigorous testing for non-functional requirements, and a release. Yes, there will be a month of rapid iterative re-releases as the product encounters the real world, but this is Agile in practice.

This isn’t just about efficiency; it’s about survival. The UK workplace needs a radical shake-up. We need to ditch the bloated teams and embrace the power of AI to streamline development. We need to stop paying lip service to Agile and start implementing it in a meaningful way.

The era of ‘cooks in the kitchen’ is over. It’s time for a revolution, and AI is leading the charge.

Call to Action:

Do you agree? Is the UK workplace lagging behind? Share your thoughts and experiences in the comments below. Let’s start a conversation.

From Trenches to Terminus: A Century of Warfare’s Chilling Evolution

A century. The span of a modern human lifetime, yet in the realm of warfare, it’s a chasm of unimaginable transformation. From the mud-soaked trenches of World War I to the sterile, algorithm-driven battlefields of today, the face of conflict has been irrevocably altered. In February, I spent a morning immersed in John Akomfrah’s ‘Mimesis: African Soldier’ exhibition at Glasgow’s Gallery of Modern Art, confronted by the visceral realities of a war fought with flesh and bone, a war where the majority of stories remain untold. Now, we face a future where war is waged by machines, where the human cost is both diminished and amplified in terrifying new ways.

The Echoes of WWI and Akomfrah’s “Mimesis”:

Akomfrah’s multi-screen installation is a haunting reminder of war’s human toll, especially for those whose sacrifices were systematically erased from history. The archival footage, the flowing water over forgotten faces, the montage of fragmented narratives – it all speaks to the chaos, the brutality, and the enduring trauma of conflict. WWI, with its trenches, its mustard gas, its sheer, senseless slaughter, was a war fought with rudimentary technology and an almost medieval disregard for human life. The images of African soldiers within ‘Mimesis’ force us to consider the colonial aspects of these wars, and the many who fought and died without ever being given a voice. The experience left me with a profound sense of the weight of history, a history often obscured by the dominant narratives.

The Rise of the Machines:

Fast forward to today, and the battlefield is a landscape of drones, AI, and robotic dogs armed with rocket launchers. The recent Ministry of Defence trials, showcasing robot dogs defusing bombs and drones autonomously detecting threats, paint a starkly different picture. We’re told these advancements ‘minimise human exposure to danger,’ that they ‘enhance Explosive Ordnance Disposal capability.’ But what about the ethical implications? What about the dehumanisation of conflict?

These robotic dogs, these AI-driven drones, they’re not just tools; they’re symbols of a profound shift in how we wage war. China’s deployment of advanced robotic dogs, designed to ‘change the approach to military operations,’ underscores this reality. The ‘precision movements’ and ‘remote classification of threats’ touted by defence officials mask a chilling truth: we’re entering an era where machines make life-or-death decisions.

Juxtaposition and Reflection:

The stark contrast between the human-centric horrors of WWI, as depicted in Akomfrah’s work, and the cold, calculated efficiency of modern robotic warfare is deeply unsettling. Where once soldiers faced each other across no man’s land, now machines engage in silent, unseen battles. The human element, once the defining feature of war, is being systematically removed.

This isn’t just about technological advancement; it’s about a fundamental, unsettling shift in our relationship with conflict. The distance created by these technologies—the drones, the remote-controlled robots, the AI-driven targeting systems—allows us to detach, to view war as a series of data points and algorithms, almost like a high-stakes video game. In fact, some of the footage we see now, with its crisp, digital clarity and detached perspective, bears an uncanny resemblance to scenes from ‘Call of Duty.’ But while the on-screen action might feel like entertainment, the consequences – the lives lost, the communities destroyed – remain as devastatingly real as ever. The danger lies in this blurred line, where the visceral horror of war is replaced by the sterile, almost gamified experience, potentially desensitising us to the true cost of human conflict.

As we stand on the precipice of this new era, with growing global tensions, escalating trade conflicts, and the chilling specter of nuclear weapons being openly discussed, the threat of a third world war looms larger than ever. Yet, amidst this existential dread, we seem more preoccupied with petty snipes at Trump and the fleeting triumphs of social media one-upmanship. It’s a surreal disconnect. We must ask ourselves: what does it truly mean to wage war in the age of AI, when the very fabric of our reality is being reshaped by algorithms and automation? Are we genuinely safer, or are we merely constructing new and more insidious forms of peril, where the line between virtual and real becomes dangerously blurred? Akomfrah’s art compels us to confront the ghosts of past conflicts, the human stories buried beneath the rubble of history. The robotic dogs, with their cold, mechanical efficiency, force us to confront a future where human agency is increasingly questioned. Both past and future demand that we grapple with the human cost of conflict, in all its evolving forms, while simultaneously challenging our collective capacity for distraction and denial.

From the mud-soaked trenches of World War I to the sterile, digital battlefields of today, warfare has undergone a radical transformation, a transformation that now feels less like a distant future and more like a chilling present. For forty years, we’ve joked about the Terminator, about Skynet, about the rise of the machines, dismissing it as mere science fiction. But as we witness the deployment of AI-driven robotic dogs and the increasing gamification of conflict, that once-fantastical vision suddenly feels disturbingly real. The human capacity for both creation and destruction remains a constant, but the tools at our disposal have changed dramatically. As we embrace the technological advancements that promise to reshape our world, we can no longer afford to be detached observers, scrolling through social media while global tensions escalate. We must confront the ethical dilemmas that haunt us, the stories that have been silenced, and the very real possibility that the future we once laughed about is now upon us. The future of warfare is not just about machines; it’s about the choices we make as humans, choices that will determine whether we become the masters of our technology or its victims.

Apple and Google: A Forbidden Love Story, with AI as the Matchmaker

Well, butter my biscuits and call me surprised! Apple, the company that practically invented the walled garden, has just invited Google, its long-standing frenemy, over for a playdate. And not just any playdate – an AI-powered, privacy-focused, game-changing kind of playdate.

Remember when Apple cozied up to OpenAI, and everyone assumed ChatGPT was going to be the belle of the Siri-ball? Turns out, Apple was playing the field, secretly testing both ChatGPT and Google’s Gemini AI. And guess who stole the show? Yep, Gemini. Apparently, it’s better at whispering sweet nothings into Siri’s ear, taking notes like a diligent personal assistant, and generally being the brains of the operation.

So, what’s in it for these tech titans?

Apple’s Angle:

  • Supercharged Siri: Let’s face it, Siri’s been needing a brain transplant for a while now. Gemini could be the upgrade that finally makes her a worthy contender against Alexa and Google Assistant.
  • Privacy Prowess: By keeping Gemini on-device, Apple reinforces its commitment to privacy, a major selling point for its users.
  • Strategic Power Play: This move gives Apple leverage in the AI game, potentially attracting developers eager to build for a platform with cutting-edge AI capabilities.

Google’s Gains:

  • iPhone Invasion: Millions of iPhones suddenly become potential Gemini playgrounds. That’s a massive user base for Google to tap into.
  • AI Dominance: This partnership solidifies Google’s position as a leader in the AI space, showing that even its rivals recognize the power of Gemini.
  • Data Goldmine (Maybe?): While Apple insists on on-device processing, Google might still glean valuable insights from anonymized usage patterns.

The Bigger Picture:

This unexpected alliance could shake up the entire tech landscape. Imagine a world where your iPhone understands your needs before you even ask, where your notes practically write themselves, and where privacy isn’t an afterthought but a core feature.

But let’s not get ahead of ourselves. There are still questions to be answered. How will this impact Apple’s relationship with OpenAI? Will Google play nice with Apple’s walled garden? And most importantly, will Siri finally stop misinterpreting our requests for pizza as a desire to hear the mating call of a Peruvian tree frog?

Only time will tell. But one thing’s for sure: this Apple-Google AI mashup is a plot twist no one saw coming. And it’s going to be a wild ride.

The Agile Phone Line is Cracking Up: Is it Time to Hang Up?

Ah, March 10th, 1876. A momentous date indeed, when Alexander Graham Bell first summoned Mr. Watson through the magic of the telephone. A groundbreaking invention that revolutionized communication and paved the way for countless innovations to come. But amidst our celebration of this technological milestone, let’s turn our attention to a more recent communication phenomenon: Agile.

Agile, that wondrous methodology that promised to streamline software development and banish the demons of waterfall projects, has become as ubiquitous as the telephone itself. Stand-up meetings, sprints, and scrum masters are now the lingua franca of the tech world, a symphony of buzzwords and acronyms that echo through the halls of countless software companies. But as we reflect on the legacy of the telephone and its evolution, perhaps it’s time to ask ourselves: Is Agile starting to sound a bit like a dial-up modem in an age of broadband?

Remember Skype? That once-beloved platform that connected us across continents, now destined for the digital graveyard on May 5th. Skype, like Agile, was once a revolutionary tool, but time and technology march on. Newer, shinier platforms have emerged, offering more features, better integration, and a smoother user experience. Could the same fate await Agile? With the rise of AI, machine learning, and automation, are we approaching a point where the Agile methodology, with its emphasis on human interaction and iterative development, becomes obsolete?

Perhaps the Agile zealots will scoff at such a notion, clinging to their scrum boards and burn-down charts like a security blanket. But the writing may be on the wall. As AI takes on more complex tasks and automation streamlines workflows, the need for constant human intervention and feedback loops might diminish. The Agile circus, with its daily stand-ups and endless retrospectives, could become a relic of a bygone era, a quaint reminder of a time when humans were still the dominant force in software development.

And speaking of communication, who could forget the ubiquitous “mute button” phenomenon? That awkward silence followed by a chorus of “You’re on mute!” has become a staple of virtual meetings, a testament to our collective struggle to adapt to the digital age. It’s a fitting metaphor for the challenges of communication in an Agile world, where information overload and constant interruptions can make it difficult to truly connect and collaborate.

So, as we raise a glass to Alexander Graham Bell and his telephonic triumph, let’s also take a moment to reflect on the future of Agile. Is it time to hang up on the old ways and embrace a new era of software development, one driven by AI, automation, and a more streamlined approach? Or can Agile adapt and evolve to remain relevant in this rapidly changing landscape? Only time will tell. But one thing is certain: the world of technology never stands still, and those who fail to keep pace risk being left behind, like a rotary phone in a smartphone world.

Agile: My Love-Hate Relationship with Iteration

Iteration. The word itself conjures up images of spinning wheels, cyclical patterns, and that hamster in its never-ending quest for… well, whatever a hamster sees in those wheels. But “iteration” is more than just a fancy word for “doing something again and again.” It’s a fundamental concept that permeates our lives, from the mundane to the profound.

Think about your morning routine. Wake up, stumble to the bathroom, brush your teeth (hopefully), make coffee (definitely). That’s an iteration, a daily ritual repeated with minor variations. Or consider the changing seasons, the ebb and flow of tides, the endless cycle of birth, growth, decay, and renewal. Iteration is the rhythm of existence, the heartbeat of the universe.

In the world of art and creativity, iteration takes center stage. Painters rework their canvases, musicians refine their melodies, writers revise their manuscripts – all in pursuit of that elusive perfect expression. Each iteration builds upon the last, refining, reimagining, and ultimately transforming the original concept into something new and hopefully improved.

But let’s not get all misty-eyed about iteration. It can be a cruel mistress, a source of frustration, a never-ending loop of “almost, but not quite.” Think about that DIY project that seemed so simple at first but has now become a Frankensteinian monster of mismatched parts and questionable design choices. Or that recipe you’ve tried a dozen times, each attempt yielding a slightly different (disastrous) result. Iteration, in these moments, feels less like progress and more like a punishment for our hubris.

And if we stretch it into the political arena, iteration takes on a particularly cynical flavor. The UK, with its revolving door of prime ministers, its endless Brexit debates, and its uncanny ability to elect leaders who promise change but deliver more of the same, is a prime example. Each election cycle feels like an iteration of the last, a Groundhog Day of broken promises, partisan squabbles, and that nagging sense that no matter who’s in charge, nothing really changes. Even the emergence of new parties, with their fresh faces and bold manifestos, often seems to get sucked into the same iterative loop, their initial idealism slowly eroded by the realities of power and the entrenched political system. Iteration, in this context, feels less like progress and more like a depressing reminder of our collective inability to break free from the past.

And then there’s Agile. Ah, Agile. The methodology that puts iteration on a pedestal, enshrining it as the holy grail of software development. Sprints, stand-ups, retrospectives – all designed to facilitate that relentless cycle of build, measure, learn. And while the Agile evangelists wax lyrical about the beauty of iterative development, those of us in the trenches know the truth: iteration can be a messy, chaotic, and often frustrating process.

We love iteration for its ability to adapt to change, to embrace uncertainty, to deliver value incrementally. We hate it for the endless meetings, the ever-growing backlog, the constant pressure to “fail fast” (which, let’s be honest, doesn’t always feel so fast). We love it for the sense of progress, the satisfaction of seeing a product evolve. We hate it for the scope creep, the shifting priorities, the nagging feeling that we’re building the plane as we fly it.

But love it or hate it, iteration is the heart of Agile. It’s the engine that drives innovation, the fuel that powers progress. And while it may not always be pretty, it’s undeniably effective. So, embrace the iteration, my friends. Embrace the chaos. Embrace the uncertainty. And maybe, just maybe, you’ll find yourself falling in love with the process, even if it’s a slightly dysfunctional, love-hate kind of love.

Wagile: In an iterative world, is there still a place for Waterfall?

So Agile. It’s the buzzword du jour, the management mantra, the thing everyone’s been talking about for at least 10 years. Apparently, it is the antidote to all our project woes. Because, you know, Waterfall is so last century. And so, it seems, is cognitive function.

To be honest, Waterfall had a good run. Planning everything upfront, meticulously documenting every single detail, then… waiting. Waiting for the inevitable train wreck when reality collided with the perfectly crafted plan. It was like building a magnificent sandcastle, only to have the tide laugh maniacally and obliterate it. Ah, fun times at Ridgemont High (aka RBS).

Agile, on the other hand, is all about embracing the chaos. Sprints, stand-ups, retrospectives – it’s a whirlwind of activity, a constant state of flux. Like trying to build that sandcastle while surfing the waves. Exhilarating? Maybe. Efficient? Debatable. Sane? No comment.

The Agile manifesto talks about “responding to change over following a plan.” Which is excellent advice, unless the change involves your entire development team suddenly deciding they’ve all become Scrum Masters or Product Owners. Then, your carefully crafted sprint plan goes out the window, and you’re left wondering if you accidentally wandered into a performance art piece.

And don’t even get me started on the stand-ups. “What did you do yesterday?” “What are you doing today?” “Are there any impediments?” It’s like a daily therapy session, except instead of delving into your inner demons, you’re discussing the finer points of code refactoring. And the “impediments”? Oh, the impediments. They range from “the coffee machine is broken” to “existential dread” (which is a constant in software development). It’s a rich tapestry of human experience, woven with threads of caffeine withdrawal and the gnawing fear that your code will spontaneously combust the moment you deploy it.

But the stand-up is just the tip of the iceberg, isn’t it? We’ve got the sprint planning, where we all gather around the backlog like it’s a mystical oracle, divining which user stories are worthy of our attention. It’s a delicate dance of estimation, negotiation, and the unspoken understanding that whatever we commit to now will inevitably be wildly inaccurate by the end of the sprint. We play “Planning Poker,” holding up cards with numbers that represent our best guesses at task complexity, secretly hoping that everyone else is as clueless as we are. It’s like a high-stakes poker game, except the only prize is more work.

Then there’s the sprint review, where we unveil our latest masterpiece to the stakeholders, praying that they won’t ask too many awkward questions. It’s a bit like showing your unfinished painting to an art critic, except the critic also controls your budget. We demonstrate the new features, carefully avoiding any mention of the bugs we haven’t fixed yet, and bask in the fleeting glow of (hopefully) positive feedback. It’s a moment of triumph, quickly followed by the realization that we have another sprint review looming in two weeks.

And let’s not forget the retrospective, the post-mortem of the sprint. We gather in a circle, armed with sticky notes and a burning desire to improve (or at least to vent our frustrations). We discuss what went well, what went wrong, and what we can do differently next time. It’s a valuable exercise in self-reflection, often culminating in the profound realization that we’re all just trying our best in a world of ever-changing requirements and impossible deadlines. It’s like group therapy, except instead of leaving feeling lighter, you leave with a list of action items and a renewed sense of impending doom. Because, you know, Agile.

But, amidst the chaos, the sprints, the stand-ups, there’s a glimmer of something… maybe… progress? Just maybe, Agile isn’t completely bonkers. Perhaps it’s a way to navigate the ever-changing landscape of software development, a way to build sandcastles that can withstand the occasional rogue wave. Or maybe it’s just a really elaborate way to procrastinate on actually finishing the project.

Either way, one thing’s for sure: it’s certainly more entertaining than Waterfall. And who knows, maybe in the process, we’ll all be forced to downgrade our cognitive functions to “basic operating level.” Who needs advanced cognitive functions when you have Agile and AI?

But amidst the gentle ribbing and self-deprecating humour, there is a serious point here. Agile, like any methodology, isn’t a magic bullet. It’s a tool, and like any tool, it can be used effectively or ineffectively. The key is understanding where Agile truly shines, where it needs to be adapted, and where – a touch of Waterfall might actually be the right approach.

That’s where I come in. With years of experience navigating the Agile landscape (and yes, even surviving a few Waterfall projects in my time), I can help your organisation cut through the jargon, identify the real pain points, and implement solutions that actually deliver results. Whether you’re struggling with sprint planning, drowning in a sea of sticky notes, or simply wondering if all this Agile stuff is worth the hassle, I can provide clarity, guidance, and a healthy dose of pragmatism. Because ultimately, it’s not about blindly following a methodology, it’s about finding the right approach to deliver value, achieve your goals, and maybe, just maybe, retain a little bit of your sanity in the process.

If you’re ready to move beyond the Agile buzzwords and build a truly effective development process, let’s talk.

From Zero to Data Hero: My Google Data Analytics Journey

Just a few short months ago, the world of data analytics felt like a vast, uncharted ocean. Now, after completing Google’s Data Analytics Professional Certificate (or at least the 12+ modules that make up the learning path – more on that later!), I feel like I’ve charted a course and am confidently navigating those waters. It’s been an intense, exhilarating, and sometimes head-scratching journey, but one I wouldn’t trade for anything.

My adventure began in October 2024, and by February 2025 (this very week, in fact), I had conquered (most of) the learning path. Conquer is the right word, because it was definitely an intense learning curve. My early-2000s junior-dev SQL skills? Yeah, they got a serious dusting off. And my forgotten Python, which was starting to resemble ancient hieroglyphics? Well, let’s just say we’re on speaking terms again.

The modules covered a huge range of topics, from the foundational “Introduction to Data Analytics on Google Cloud” and “Google Cloud Computing Foundations” to more specialized areas like “Working with Gemini Models in BigQuery,” “Creating ML Models with BigQuery ML,” and “Preparing Data for ML APIs on Google Cloud.” (See the full list at the end of this post!) Each module built upon the previous one, creating a solid foundation for understanding the entire data analytics lifecycle.

But the real stars of the show for me were BigQuery and, especially, Looker Studio. I’ve dabbled with other data visualization tools in the past (mentioning no names… cough Microsoft cough Tableau cough), but Looker Studio blew me away. It’s intuitive, powerful, and just… fun to use. Seriously, I fell in love. The ease with which you can connect to data sources and create insightful dashboards is simply unmatched. It’s like having a superpower for data storytelling!

One of the biggest “aha!” moments for me was realizing the sheer power of data insights. Mining those hidden gems from large datasets is incredibly addictive. And the fact that Google makes it so easy to access public datasets through BigQuery? Game changer. It’s like having a data goldmine at your fingertips.
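To give a flavour of the kind of query that makes those insights so addictive, here’s a toy sketch. It uses Python’s built-in sqlite3 as a stand-in for BigQuery (so it runs anywhere, no Cloud account needed), and the table and data are entirely made up – but the GROUP BY / ORDER BY pattern is exactly the sort of thing you’d write against a BigQuery public dataset:

```python
import sqlite3

# Toy stand-in for a public dataset: a handful of invented taxi trips.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE trips (pickup_borough TEXT, fare REAL)")
conn.executemany(
    "INSERT INTO trips VALUES (?, ?)",
    [("Manhattan", 12.5), ("Manhattan", 9.0), ("Brooklyn", 15.0),
     ("Brooklyn", 22.0), ("Queens", 30.0)],
)

# Aggregate raw rows into an insight: trips and average fare per borough.
# The same SQL pattern works near-verbatim in BigQuery's standard SQL.
rows = conn.execute("""
    SELECT pickup_borough,
           COUNT(*)            AS trip_count,
           ROUND(AVG(fare), 2) AS avg_fare
    FROM trips
    GROUP BY pickup_borough
    ORDER BY avg_fare DESC
""").fetchall()

for borough, count, avg_fare in rows:
    print(f"{borough}: {count} trips, avg fare {avg_fare}")
```

Swap the in-memory table for something like a BigQuery public dataset and the console’s query editor, and that’s the whole game: a few lines of SQL, and raw rows turn into a story you can drop straight into a Looker Studio dashboard.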

This learning path has ignited a real passion within me. So much so that I’m now pursuing a Data Analysis Diploma, which I’m hoping to wrap up before June. And, because I apparently haven’t had enough learning, I’m also signing up for the Google Cloud Data Analytics Professional Certificate. I’m all in!

I have to say, the entire Google Cloud platform just feels so much more integrated and user-friendly compared to the Microsoft offerings I’ve used. Everything works together seamlessly, and the learning resources are top-notch. If you’re considering a career in data analytics, I would wholeheartedly recommend the Google path over other options.

I’m especially excited to dive deeper into the machine learning aspects. And the integration of Gemini? Genius! Having it as a code buddy has been a huge help, especially when I’m wrestling with a particularly tricky SQL query or trying to remember the correct syntax for a Python function. Seriously, it’s like having a data analytics guru by my side.

Stay tuned for future posts where I’ll be sharing more about my data analytics journey, including tips and tricks, project updates, and maybe even some data visualizations of my own!

Coursera offers the official course: https://www.coursera.org/professional-certificates/google-data-analytics – with this you get a recognised, formal professional certificate.

Or jump into Google Cloud Skills Boost: https://www.cloudskillsboost.google/ – get yourself a Cloud account and get friendly with Gemini.

Modules Completed:

  • Work with Gemini Models in BigQuery
  • Analyzing and Visualizing Data in Looker Studio
  • BigQuery for Data Analysts
  • Boost Productivity with Gemini in BigQuery
  • Create ML Models with BigQuery ML
  • Derive Insights from BigQuery Data
  • Developing Data Models with LookML
  • Google Cloud Computing Foundations: Data, ML, and AI in Google Cloud
  • Introduction to Data Analytics on Google Cloud
  • Manage Data Models in Looker
  • Prepare Data for Looker Dashboards and Reports
  • Prepare Data for ML APIs on Google Cloud

Ignite Your Own ‘Aha!’ Moments: Lessons from Edison

October 21st, 1879. Thomas Edison, weary-eyed but determined, watched a humble carbon filament glow steadily in a glass bulb. It wasn’t the first incandescent light, but it was the first practical one, a breakthrough that illuminated the path to the electrified world we know today. Imagine that feeling – the surge of triumph, the “aha!” moment that changed everything.

Edison’s invention wasn’t just about brighter nights; it sparked a revolution. Factories could hum around the clock, homes became havens of comfort, and cities transformed into glittering landscapes. But that initial spark, that flash of inspiration, is something we all experience, isn’t it?

Think about your own “light bulb moments” – that sudden realization when solving a tricky problem, the innovative idea that takes your breath away, or even the simple joy of understanding a complex concept for the first time. These moments, big or small, are the engines of progress, the catalysts for change.

145 years after Edison’s breakthrough, we’re surrounded by the descendants of his genius. But the spirit of innovation hasn’t dimmed. Today, our “light bulb moments” are powered by algorithms, fueled by data, and manifested in the smart devices that fill our lives.

Imagine this: you walk into your home, and the lights adjust to your preferred setting, the thermostat knows your ideal temperature, and your favorite music starts playing softly. This isn’t science fiction; it’s the reality of smart home technology, a testament to countless “aha!” moments that have built upon Edison’s legacy.

From voice assistants that anticipate our needs to AI-powered apps that personalize our experiences, technology continues to evolve at an astonishing pace. And behind every innovation, every leap forward, is a human being experiencing that same thrill of discovery, that same “light bulb moment” that Edison felt 145 years ago.

So the next time you have a flash of brilliance, no matter how small, remember that you’re part of a long lineage of innovators, stretching back to that dimly lit room in Menlo Park. Embrace that “aha!” moment, nurture it, and let it shine. Who knows? You might just spark the next revolution.

So Long, and Thanks for All the Algorithms (Probably)

The Guide Mark II says, “Don’t Panic,” but when it comes to the state of Artificial Intelligence, a mild sense of existential dread might be entirely appropriate. You see, it seems we’ve built this whole AI shebang on a foundation somewhat less stable than a Vogon poetry recital.

These Large Language Models (LLMs), with their knack for mimicking human conversation, consume energy with the same reckless abandon as a Vogon poet on a bender. Training these digital behemoths requires a financial outlay that would make a small planet declare bankruptcy, and their insatiable appetite for data has led to some, shall we say, ‘creative appropriation’ from artists and writers on a scale that would make even the most unscrupulous intergalactic trader blush.

But let’s assume, for a moment, that we solve the energy crisis and appease the creative souls whose work has been unceremoniously digitised. The question remains: are these LLMs actually intelligent? Or are they just glorified autocomplete programs with a penchant for plagiarism?

Microsoft’s Copilot, for instance, boasts “thousands of skills” and “infinite possibilities.” Yet, its showcase features involve summarising emails and sprucing up PowerPoint presentations. Useful, perhaps, for those who find intergalactic travel less taxing than composing a decent memo. But revolutionary? Hardly. It’s a bit like inventing the Babel fish to order takeout.

One can’t help but wonder if we’ve been somewhat misled by the term “artificial intelligence.” It conjures images of sentient computers pondering the meaning of life, not churning out marketing copy or suggesting slightly more efficient ways to organise spreadsheets.

Perhaps, like the Babel fish, the true marvel of AI lies in its ability to translate – not languages, but the vast sea of data into something vaguely resembling human comprehension. Or maybe, just maybe, we’re still searching for the ultimate question, while the answer, like 42, remains frustratingly elusive.

In the meantime, as we navigate this brave new world of algorithms and automation, it might be wise to keep a towel handy. You never know when you might need to hitch a ride off this increasingly perplexing planet.

Comparison to Crypto Mining Nonsense:

Both LLMs and crypto mining share a striking similarity: they are incredibly resource-intensive. Just as crypto mining requires vast amounts of electricity to solve complex mathematical problems and validate transactions, training LLMs demands enormous computational power and energy consumption.
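Those “complex mathematical problems” are conceptually simple: proof-of-work mining just brute-forces a nonce until a block’s hash clears a difficulty target. Here’s a minimal toy sketch in Python – real mining uses double SHA-256 over binary block headers at astronomically higher difficulty, so treat the `mine` helper and the hex-prefix target as illustrative assumptions, not the actual Bitcoin protocol:

```python
import hashlib

def mine(block_data: str, difficulty: int) -> int:
    """Brute-force a nonce so that sha256(block_data + nonce)
    starts with `difficulty` leading zero hex digits."""
    target = "0" * difficulty
    nonce = 0
    while True:
        digest = hashlib.sha256(f"{block_data}{nonce}".encode()).hexdigest()
        if digest.startswith(target):
            return nonce
        nonce += 1  # every failed guess is electricity spent on nothing

# Toy difficulty of 4 hex zeros: ~65,000 guesses on average.
nonce = mine("block: alice pays bob 1 coin", 4)
print(nonce, hashlib.sha256(f"block: alice pays bob 1 coin{nonce}".encode()).hexdigest())
```

Each extra leading zero multiplies the expected work by 16, which is precisely why real networks burn so much power: the difficulty is continually retuned so that the combined hashing of the entire planet still takes minutes per block.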

Furthermore, both have faced criticism for their environmental impact. Crypto mining has been blamed for contributing to carbon emissions and electronic waste, while LLMs raise concerns about their energy footprint and the sustainability of their development.

Another parallel lies in the questionable ethical practices surrounding both. Crypto mining has been associated with scams, fraud, and illicit activities, while LLMs have come under fire for their reliance on massive datasets often scraped from the internet without proper consent or attribution, raising concerns about copyright infringement and intellectual property theft.

In essence, both LLMs and crypto mining represent technological advancements with potentially transformative applications, but they also come with significant costs and ethical challenges that need to be addressed to ensure their responsible and sustainable development.