[{"content":"","date":"April 20, 2026","externalUrl":null,"permalink":"/","section":"Anachronistic Monk","summary":"","title":"Anachronistic Monk","type":"page"},{"content":"","date":"April 20, 2026","externalUrl":null,"permalink":"/tags/essays/","section":"Tags","summary":"","title":"Essays","type":"tags"},{"content":" There\u0026rsquo;s a question I get asked often enough that I\u0026rsquo;ve started collecting answers for it: why do you shoot everything in black and white?\nThe honest answer is that I find something in a monochrome frame that color seems to smudge over. But \u0026ldquo;I just like it\u0026rdquo; is a terrible answer for anyone who actually wants to talk, so over time I\u0026rsquo;ve assembled a second, stranger justification: one that borrows from thermodynamics and from Shannon, and which I like precisely because it probably shouldn\u0026rsquo;t work.\nGibbs, briefly # In chemistry, the Gibbs free energy equation tells you whether a reaction will spontaneously move forward:\n$$ \\Delta G = \\Delta H - T \\Delta S $$The short, lossy version is this: if you want the reaction to be favorable, $\\Delta G$ has to be negative. And one of the easiest ways to get there is for the entropy of the system to increase, for $\\Delta S$ to be positive. The universe, famously, prefers disorder.\nThat\u0026rsquo;s the direction of things. Forward. More states, more dispersal, more entropy. The arrow of time is, in a sense, the arrow of entropy pointing outward.\nShannon, briefly # There\u0026rsquo;s a different entropy that lives in information theory. A fair coin can land in one of two states, so it carries one bit of information when it\u0026rsquo;s flipped. Two coins together can land in four combined states and carry two bits. Add a third and you\u0026rsquo;re at three bits, eight states. The more states a system can occupy, the more possibility and unpredictability, and the higher its Shannon entropy. 
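The coin arithmetic above is one formula in disguise; as a sketch (my own addition, using standard information-theory notation, not part of the original argument): for a source with $N$ equally likely states,
$$ H = \\log_2 N \\ \\text{bits}, \\qquad \\text{so } n \\text{ fair coins give } H = \\log_2 2^n = n. $$In the general case of outcome probabilities $p_i$, the entropy is $H = -\\sum_i p_i \\log_2 p_i$, which reduces to $\\log_2 N$ whenever every outcome is equally likely.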
Information and uncertainty turn out to be the same currency.\nThis is where photographs come in.\nA color photograph is a lot of information # Consider what a color image carries. Every pixel is a vector, red, green, blue, each channel taking some value in some range, and the space of possible images is enormous. The information content is genuinely huge; the surface area of possibility vast.\nNow strip the color out. What survives? Lines, contours, light and its absence, geometry. The shape of a face. The weight of a shadow. An expression. In a good photograph, nothing essential leaves when the color does: we still read grief, or tenderness, or the way the afternoon leans against a wall. What\u0026rsquo;s gone is the RGB vector space. The axes have collapsed. The entropy, in Shannon\u0026rsquo;s sense, has dropped.\nYou are conveying the same meaning with fewer possible states. That is, by almost any reasonable definition, a reduction in entropy.\nThe conceit # Here is where I stretch the metaphor until it creaks, and I want to be honest that it creaks. The two entropies, thermodynamic and informational, are related but not interchangeable. You cannot literally reverse a chemical reaction by underexposing a roll of Tri-X. Boltzmann and Shannon are cousins, not the same person.\nBut imagine.\nImagine if every photograph taken in the world were a small monochrome act: a local subtraction of information, a pixel-by-pixel refusal of the RGB expansion. Imagine the universe, on aggregate, gently nudging its $\\Delta S$ toward zero. The forward reaction, the relentless march of more, faster, brighter, louder, becomes, by the slimmest margin, unfavorable.\nWhat then? We don\u0026rsquo;t go backward. Entropy doesn\u0026rsquo;t work like rewind. But perhaps we slow. Perhaps we stay. Perhaps the universe pauses, looks around, and decides it\u0026rsquo;s fine here, actually.\nOr perhaps, for a moment, we tip into a state that feels older. Simpler. 
The way a black and white still of your grandmother in 1960 feels more like a memory than a document.\nWhy I actually shoot this way # I know the physics is a conceit. I know the second law isn\u0026rsquo;t going to be repealed by my camera. But the metaphor tells a truth about why I shoot monochrome: because I want to make pictures that hold less and mean more. Because color is an engine of excess, and restraint is a kind of prayer. Because a frame that strips away the inessential is also a frame that says, this, this much, is enough.\nEvery photograph is a small argument about what to keep. Mine, it turns out, is an argument for keeping less.\nAnd if, by some very slim thermodynamic accident, a million of us arguing the same thing were to hold the universe still for half a second longer, I\u0026rsquo;d take that trade.\n","date":"April 20, 2026","externalUrl":null,"permalink":"/posts/on-shooting-in-black-and-white/","section":"Posts","summary":" There’s a question I get asked often enough that I’ve started collecting answers for it: why do you shoot everything in black and white?\n","title":"On Shooting in Black and White (An Entropy Argument)","type":"posts"},{"content":"","date":"April 20, 2026","externalUrl":null,"permalink":"/tags/photography/","section":"Tags","summary":"","title":"Photography","type":"tags"},{"content":"","date":"April 20, 2026","externalUrl":null,"permalink":"/posts/","section":"Posts","summary":"","title":"Posts","type":"posts"},{"content":"","date":"April 20, 2026","externalUrl":null,"permalink":"/tags/","section":"Tags","summary":"","title":"Tags","type":"tags"},{"content":"","date":"December 5, 2025","externalUrl":null,"permalink":"/tags/language/","section":"Tags","summary":"","title":"Language","type":"tags"},{"content":"","date":"December 5, 2025","externalUrl":null,"permalink":"/tags/society/","section":"Tags","summary":"","title":"Society","type":"tags"},{"content":"“Call me Ishmael.”\nThat line opens one of literature’s great 
sweeping sagas: a man, a whale, and the bruising, salt-earning adventure of simply not dying at sea. But the name does something subtle, like a polite knock on the front door of literature. It positions the narrator. In Sanskrit, nāma means exactly that: the thing by which you call someone from afar.\nThe internet insists the English “name” is Germanic. Fine. I’ll let that slide. There are forces already busy retrofitting half the dictionary into proud Indic ancestry, so I don’t need to contribute to that orchestra. I’m not here to litigate etymology. I’m here to question something we pretend is apolitical.\nAn honorific, on the other hand, is a title or form of address used to show respect for someone’s role or position: for example, calling someone ‘Professor’ or \u0026lsquo;Wing Commander\u0026rsquo; or \u0026lsquo;Doctor\u0026rsquo;.\n⸻—————————————————————————––\nThe American First-Name Democracy # My career began in Calcutta, where using someone’s first name was fine as long as you added a soft da at the end. It was the outsider’s cultural cheat code. You sounded casual, but the suffix still signaled respect, affection, and that you were talking to an elder brother figure, the kind you butter up before asking for his bike keys to impress someone wildly out of your league.\nIn the United States, calling someone by their first name is considered a gesture of equality or at least a very convincing performance of equality. And to be fair, in a corporate environment, where speed, informality, and horizontal communication genuinely help get the work done, first-name culture is not just convenient, it is operationally essential. But what works for corporate efficiency does not automatically map onto the terrain of social orders.\nIt’s also wildly practical. 
No one is expected to flawlessly pronounce “Dhritiman Chatterjee” or, for that matter, “Durwasa Chakraborty” without committing phonetic felony.\nMany international students, especially from China, glide through campus with two names in effortless tandem: the Mandarin one, and the American one. Each one worn like a couture nom de plume, knitted perfectly for both the résumé and the room. (Alright, that’s my French quota for the year. Also, with those two words, my French vocabulary is now officially exhausted.)\nFitzgerald, of course, wasn’t selling the American Dream to kids whose first names exceed the character limit on most US government forms. USCIS paperwork would quietly agree, and even the Starbucks barista, with a marker hovering in defeat, would offer a solemn nod. But that rant belongs to another chapter entirely.\nWhat troubles me is not the first-name basis itself (it has its place in corporate life) but the easy assumption that the same flattening principle must govern social relationships and social services. Naming hierarchies don’t become oppressive simply because a workplace thrives without them.\n⸻—————————————————————————––\nThe Flattening Where Structure Is Useful # The ethos of the first-name democracy has never fully resonated with me.\nI find it difficult, almost socially inconsiderate, to address a doctor without the title Doctor, a professor without Professor, or a Wing Commander without the rank that defines their professional identity.\nThis instinct is not a vestige of servility. 
It is not a residue of my colonial conditioning.\nIn a society as vast and structurally uneven as India, where the lower half of the socioeconomic pyramid alone exceeds the populations of many nations, honorifics perform an essential social function. They acknowledge responsibility rather than superiority; they signal the weight of a role rather than the ego of the individual.\nA champion of equality reading this might argue that first-name culture symbolically flattens hierarchy and seems like a good, healthy start for a classless society. Yet hierarchy does not dissolve simply because we decline to articulate it. On the contrary, the refusal to name it risks obscuring the very burdens and accountabilities that institutions place on individuals.\nAnd this is the distinction worth defending: First-name basis may be ideal for agile corporate structures, but social life :: education, mentorship, medicine, public services :: requires more than speed; it requires signalling and respect.\nIn a country where the weight and impact of the broader infrastructure of social labour remain chronically undervalued, the least we can do is recognize the office, not merely the occupant.\nTo call someone by the title is not just an act of reverence. It is an act of recognition.\n⸻—————————————————————————––\nSeven Words for ‘Teacher\u0026rsquo; # If anything, the British, however messed up their larger imperial legacy, were not entirely wrong in devising and institutionalizing such honorifics. They understood something about social architecture: that titles can function as stabilizers. They create clarity about roles, responsibilities, and the weight a person voluntarily agrees to carry.\nBut before you sprint off to dismiss me as a sucker for British imperial nostalgia, fanboying over etiquette imported alongside two centuries of plunder and paperwork, let’s remember something: the idea of honorifics is neither alien nor uniquely colonial. 
It was part of our own cultural grammar long before the British or Europeans arrived.\nOur own traditions once carried a beautifully layered vocabulary for those who shape us:\nAdhyapak — the instructor. Shikshak — the one who teaches with understanding. Upadhyay — the explainer, the interpreter. Acharya — the one who teaches by living the lesson. Pandit — the master of a discipline. Dhrishta — the visionary. Guru — the one who guides you from darkness to light. As history tells it, Aristotle was Alexander’s teacher. But which teacher was he supposed to be?\nThe one who taught him philosophy?\nThe one who taught him to read?\nThe one who trained his reasoning with geometric shapes?\nThe one who explained the architecture of war?\nThe one who made him strategic enough to outmaneuver armies at Hydaspes (the river now known as the Jhelum)?\nOr the one who made him shrewd enough to sense a mutiny before it happened and neutralize a coup within his own ranks?\nAll of these are radically different roles, yet history compresses them into a single noun: teacher.\nThink of Naruto: Iruka and Kakashi are both sensei, yet they give Naruto entirely different things :: one gives him belonging; the other gives him direction. The title is the same, but the responsibility is different.\nThe Greeks had dozens of words for love, but not for teachers.\nAncient Greek is famous for its exquisitely nuanced vocabulary around love: agápē, érōs, philía, storgē, xenia, and so on. (Yes, I shamelessly copied the list from Pinterest.)\nBut for “teacher”? The language is surprisingly sparse.\nThe primary word was didaskalos, literally “the one who teaches.” There were related terms; however, “didaskalos” remained a largely flat category.\nJapanese follows a similar pattern. In Japanese, 先生 (sensei) literally means “one who was born before” :: a person with seniority, wisdom, or expertise.\nThe term is broad, respectful, but not differentiated. 
It doesn’t distinguish between the instructor, the vanguard of morality, the visionary, or the spiritual guide.\n(Or maybe I’m completely wrong, and my so-called “expertise” is just the unholy fusion of too many Stephen Fry interviews on Greek mythology and Japanese anime; none of which qualifies as the gold standard for social and etymological essays.)\nIn other words:\nHierarchy isn’t inherently oppressive. Sometimes, it’s a way of paying attention to what exactly a person does for you.\nToday, however, we flatten everything into a single word: Guru. Anyone with chalk dust on their shirt, yoga pants on a mat, or ochre robes on a dais gets the title automatically.\nBut the person who teaches you calculus is not the same as the person who asks you who you want to become. If you’re lucky, truly lucky, you may find both in the same person. I did. And among the many teachers who have guided me, one in particular is the reason I write this super lengthy blog. And I know most people don’t find that, which only deepens my gratitude for the rare few who manage to shape your mind and steady your compass at the same time.\nNot every lecturer becomes a life-anchor.\nNot every educator becomes a compass.\nAnd they don’t have to.\nBut acknowledging layered roles helps position the student. It situates learning. It creates a quiet structure: I am here to be shaped, and you are here to take responsibility for that shaping.\n⸻—————————————————————————––\nWhy I Still Use Honorifics # Not because I believe in hierarchy as domination.\nBut because I believe in hierarchy as orientation.\nSome roles deserve acknowledgement :: not for the person, but for the labour they carry, the obligation they uphold, the weight their title silently bears.\nIn an age aspiring to egalitarianism, perhaps dignity does not lie in flattening everything, but in calling people by the name that honours their responsibility.\nCorporate life may run on a first-name basis. 
Society, however, runs on recognising who holds which burdens. One system optimises for speed; the other for meaning. They do not have to be in conflict, but they should not be confused.\nAnd that is why I still use honorifics; why “Sir,” “Professor,” “Doctor,” “Brigadier” come naturally to me. Not out of deference, not out of fear, but out of recognition. Out of gratitude. Out of a belief that some kinds of work deserve to be named, because naming is the smallest, simplest way of saying: I see what you carry.\nAnd before anyone panics: no, there is no Part B. I promise this franchise ends here.\n","date":"December 5, 2025","externalUrl":null,"permalink":"/posts/english-honorifics-and-in-academia/","section":"Posts","summary":"“Call me Ishmael.”\nThat line opens one of literature’s great sweeping sagas: a man, a whale, and the bruising, salt-earning adventure of simply not dying at sea. But the name does something subtle, like a polite knock on the front door of literature. It positions the narrator. In Sanskrit, nāma means exactly that: the thing by which you call someone from afar.\n","title":"The Place of Honorifics in Modern Society Part A :: Academia","type":"posts"},{"content":"","date":"October 19, 2025","externalUrl":null,"permalink":"/tags/academia/","section":"Tags","summary":"","title":"Academia","type":"tags"},{"content":"","date":"October 19, 2025","externalUrl":null,"permalink":"/tags/ai/","section":"Tags","summary":"","title":"Ai","type":"tags"},{"content":"\u0026ldquo;With great power comes great responsibility.\u0026rdquo;\nVoltaire (and also every Spider-Man movie ever)\nThere\u0026rsquo;s this video of Chad Smith, the drummer from Red Hot Chili Peppers. He\u0026rsquo;s hearing a song for the first time, no prep, no notes, no second take. And yet somehow, he just gets it. He catches the groove like it\u0026rsquo;s muscle memory, then makes the whole thing sound better. That\u0026rsquo;s the magic of practice. 
Not the kind where you count hours, but the kind where you repeat something so many times it becomes your second nature, your reflex. Whether it\u0026rsquo;s drumming, coding, or explaining your PhD topic to your relatives without crying, the idea\u0026rsquo;s the same: do it till it\u0026rsquo;s boring, and then keep doing it till it\u0026rsquo;s beautiful.\nWhy I\u0026rsquo;m Writing This # \u0026ldquo;Give it away, now….\u0026rdquo;\nThis isn\u0026rsquo;t a psychiatrist\u0026rsquo;s reflection. Not a neuroscientist\u0026rsquo;s analysis. And definitely not one of those \u0026ldquo;10 ways to master your mind\u0026rdquo; Medium posts written by someone who just discovered cold showers. Sure, don\u0026rsquo;t air your dirty laundry in public, but cataloging is how self-reflection begins. This is just me, a guy who left the industry after five years, thinking he was returning to academia armed with experience, purpose, and maybe even principles. But somewhere along the way, he got distracted, mistook hype for progress, and forgot what he actually came here for: to become the best damn teacher of computer science he could be. I took the easy path when I should’ve been struggling, and that’s on me.\nAgentic LLMs aren\u0026rsquo;t just powerful, they\u0026rsquo;re seductive. They don\u0026rsquo;t simply accelerate your work; they make you feel smarter while quietly stealing the struggle that actually teaches you something. It\u0026rsquo;s like outsourcing your gym reps and still expecting a Greek-god-like physique. If you want the formal takes, you can read these:\narXiv:2506.08872 MDPI: LLMs and Knowledge Work What follows is my own messy field report, a note from someone relearning that reflection only counts when you show up and keep trying. I get the importance of LLMs, especially in industry, where speed means promotion and profit. So surely, build the next generation of regex(*). 
But this is about LLMs and the academic world, where the goal isn\u0026rsquo;t just to build faster, but to get to a level where certain CS skills feel second nature, just like in the Chad Smith video.\nBack When Failing Was Learning # \u0026ldquo;Scar tissues that I wish you saw,\u0026rdquo;\nAbout eleven years ago, I picked up coding. Like every other kid who bombed a competitive exam, I thought: Fine, I\u0026rsquo;ll redeem myself in competitive programming. Here\u0026rsquo;s a fossil from that era:\nfeb_chefeq.c If you open it, you\u0026rsquo;ll see spaghetti code, random binaries (don\u0026rsquo;t ask me why I pushed those). I was young and dumb. (Now I\u0026rsquo;m just not young.) But here\u0026rsquo;s the thing: even in that chaos, I was learning. Every WA (wrong answer), every TLE (time limit exceeded), every embarrassing commit left a small scar that made the lesson stick.\nTake this snippet for example:\nint compare(void const *p, void const *q) { return (*(int*)p - *(int*)q); } There’s no universe in which I could’ve written that myself, and that’s fine. Back then, I was basically a raccoon rummaging through Stack Overflow, stealing shiny bits of code and gluing them together with misplaced confidence. And somehow, that counted as victory.\nTruth be told, I was terrible at competitive programming. In fact, I landed my first Software Engineer job, not through brilliance, but through sheer repetition and stubbornness, trying and retrying LeetCode questions.\nEven now, I’d probably sit comfortably in the bottom 10 percent of the CP world. But you know what? That’s okay. Because, in hindsight, the point was never to be good, better, or best; it was to keep trying long enough.\nEnter the Agentic LLM (a.k.a. The Drug Phase) # \u0026ldquo;I got a bad disease, out from my brain is where I bleed…\u0026rdquo;\nFast forward a decade. Tools evolved.\nCalculators are tools. Google is a tool. Stack Overflow is a tool. Agentic LLMs? They\u0026rsquo;re not tools. They\u0026rsquo;re drugs. 
They don\u0026rsquo;t just help you; they replace you.\nAnd the dangerous part? They make you feel like you\u0026rsquo;ve understood something when all you\u0026rsquo;ve done is follow instructions from a very confident autocomplete. Take AVL trees. I asked an LLM to write one for me. The cycle went like this:\nAsk for code. Paste errors back into chat. Repeat. After 2-3 iterations, success! It compiles! What did I learn? Absolutely nothing about AVL trees.\nBut I did become frighteningly good at copy-pasting error messages. That\u0026rsquo;s the trap.\nLLMs can drag you to the finish line, but when the next race starts, you\u0026rsquo;re limping.\nThe Real Cost of \u0026ldquo;Efficiency\u0026rdquo; # \u0026ldquo;This life is more than just a read-through…\u0026rdquo;\nIn industry, the higher you climb, the shorter your onboarding.\nAt some point, you\u0026rsquo;re expected to start shipping code the moment your setup script finishes running. That\u0026rsquo;s only possible if you\u0026rsquo;ve built intuition, that Chad-Smith-level rhythm where your hands know what to do before your brain catches up.\nAnd intuition only comes from pain. Academic exercises are the gym. Skip them, and you\u0026rsquo;ll pull a muscle the day you lift something real.\nWhen the Oracle Lies # \u0026ldquo;Psychic spies from China try to steal your mind’s elation…\u0026rdquo;\nAgentic LLMs are trained on data, and data has boundaries.\nSo if you\u0026rsquo;re in a course that uses an obscure language or explores theory too new or too weird, the oracle goes silent. Sure, the Greeks said the Oracle got things wrong because of human arrogance.\nBut it\u0026rsquo;s still arrogance when we assume we know enough to let the machine think for us. I once asked an LLM to generate an automated proof. It gave me a perfect-looking answer that didn\u0026rsquo;t prove anything.\nWhen it failed, I broke the proof into subgoals and brute-forced my way through. 
Hours later, I had a \u0026ldquo;working\u0026rdquo; proof. But I\u0026rsquo;d learned nothing about proving.\nI\u0026rsquo;d just learned to outsource persistence. It\u0026rsquo;s like thinking you\u0026rsquo;ve mastered chess because Stockfish told you which move to play.\nYou win the game, but you don\u0026rsquo;t understand why you won.\nThe Range Analysis Fiasco # \u0026ldquo;Throw away your television, make a break for better days…\u0026rdquo;\nThis semester, I tried it again. There was an assignment on range analysis.\nI thought, why not let an LLM write Kildall\u0026rsquo;s algorithm for me? It produced beautiful code, better formatted than anything I\u0026rsquo;d write at 2 AM, so I used it.\nGot a polite wrist-slap from the instructor. Lesson learned? None about Kildall.\nBut I did learn that shortcuts come with receipts. The right move would\u0026rsquo;ve been to write it myself, submit, get my points, and then compare my approach with the machine\u0026rsquo;s.\nPost hoc, not a priori.\nThe Punchline # \u0026ldquo;The more I see, the less I know, the more I’d like to let it go…\u0026rdquo;\nSo yeah, LLMs can accelerate your output. But growth doesn’t come from output; it comes from ownership. Every WA and TLE used to sting because they were mine. My mistakes, my blind spots, my late-night bugs that taught me patience and precision. Every hallucinated proof now feels like a thief in disguise, wearing my tone, borrowing my logic, but stealing the struggle that once made me sharper.\nThe truth is, I outsourced my curiosity. I let convenience masquerade as competence. Each time I asked the machine for an answer I could’ve wrestled with, I handed over a small piece of my craft. The worst part? It felt good. Efficient. Impressive even. Until one day I realized I hadn’t learned; I had only produced.\nUse the machine, but don’t let it use you. Because back then, failing meant learning. 
Now, if I’m not careful, learning just means asking better prompts, and that’s not growth; that’s surrender dressed as progress.\nWhy This Is Dangerous # \u0026ldquo;Destruction leads to a very rough road, but it also breeds creation…\u0026rdquo;\nI still remember the first time I came across the phrase “segment trees with lazy propagation.” It sounded like wizardry, the kind of thing only people who truly got algorithms could whisper about with authority. Even today, I can talk about it, maybe even recognize its pattern and say, “ah, this problem smells like a segment tree.” But if you hand me a keyboard and ask me to implement it from scratch, I’ll almost certainly fumble.\nAnd that’s okay. Because learning something complex was supposed to hurt a little; it demanded time, rigor, and a healthy dose of failure. That discomfort was the tuition fee for understanding.\nBut with LLMs, you skip that pain. You can conjure a working implementation in seconds, and it feels like mastery, but it isn’t. You haven’t built the muscle; you’ve borrowed it.\nThe Self-Roast # \u0026ldquo;I’m an ocean in your bedroom, make you feel warm…\u0026rdquo;\nOnce upon a time, I used modus ponens in casual conversation so often that my friends actually nicknamed me Modus Ponens. I thought it made me sound profound. Spoiler: it didn’t. It just made me unbearable.\nAnd here’s the cruel irony. In a recent meeting with a professor, I had the perfect chance to use my redemption arc :: \u0026ldquo;by Modus Ponens\u0026rdquo;, \u0026hellip; and I blanked. Completely. Months of outsourcing reasoning to machines had dulled my instincts so badly that even modus ponens, my old party trick, didn’t come naturally anymore.\nThis from someone who once represented his school in the CBSE Math Olympiad. Of course, in my current institute, where every other kid has an IMO gold medal or a single-digit rank, that’s hardly impressive; it’s not even worth mentioning. But that’s not the point. 
The point is: I’m going backwards.\nThis isn’t self-pity; it’s confession disguised as reflection. A plea for academic sincerity. Back in school, we mocked rote learning (memorizing proofs, formulas, steps), calling it mechanical. But maybe there was hidden virtue in that repetition. Because repetition built recall, and recall built intuition. And you can’t reason if you can’t remember. Understanding isn’t just clarity; it’s also memory.\nOf course, there’s a sociological layer to this too: “you can’t stay decent in indecent times,” as Harvey Dent put it. When everyone around you is using an LLM to ace their assignments or polish their research, it’s almost naïve not to. After all, the system rewards speed, not struggle. Companies care more about CGPA than craftsmanship.\nBut pause. Breathe. You might just be tumbling down a slope that looks like progress but isn’t. That’s when it hit me: I hadn’t just used the machine; I had become dependent on it.\nLLMs can generate code, but they can’t give you intuition. The code lives “out there,” not in you. After 112 retries, the machine will eventually cough up something that compiles, maybe even pull it from some forgotten corner of the internet. But those 112 failures should’ve been mine. Because if they were, the next time would take fewer. That’s how learning works: it amortizes, it compounds.\nOutput is not insight. And that’s the tragedy. I wanted to sound like a logician, but somewhere along the way, I turned into a copy-paste monkey in wannabe-academic robes.\nOn PhDs and Integrity # \u0026ldquo;Dream of Californication…\u0026rdquo;\nI often wonder if I’m really cut out for this PhD path. And honestly, the answer shouldn’t depend on brilliance or hard work; both are overrated currencies. The only question worth asking is: Can I be academically sincere?\nBecause patterns fade, passion burns out, but integrity, that’s the quiet engine that keeps running when everything else falls apart. 
If I choose this path, I must make sure it rests on that foundation. Not on borrowed brilliance. Not on synthetic understanding. But on integrity, the kind that doesn’t need applause to persist.\nMaybe six months from now, I’ll be staring down a pile of failures. Maybe I’ll quit, and someday reread these words, a message from a younger version of myself who still believed he could do better. And yet, even then, I hope I’ll remember the rain in the first week of December. The way it falls in Chennai, like a conversation that refuses to end. How, somewhere between Marina’s waves and Besant’s cafés, life slows just enough for you to notice the salt on your skin, the weight of the evening, the small ache of wanting something unnamed.\nAnd if I do stay, if I keep walking this road, I hope I remember why. To learn sincerely, to rebuild slowly. If I continue on this path, I hope to mend my ways before it’s too late.\nA Cautionary Note # \u0026ldquo;All around the world, we can make time…\u0026rdquo;\nComputer Science isn’t just a career track anymore. It’s a responsibility. We’re not merely coding; we’re shaping the language of tomorrow’s machines, and maybe even tomorrow’s minds.\nBut if our foundations are built on hallucinated proofs and shortcut code, we’re no longer engineers. We’re gamblers, praying the machine rolls a lucky seven.\nSo here’s my parting note to myself and to anyone still reading:\nDon’t win the assignment and lose the career. # 
He catches the groove like it’s muscle memory, then makes the whole thing sound better. That’s the magic of practice. Not the kind where you count hours, but the kind where you repeat something so many times it becomes your second nature, your reflex. Whether it’s drumming, coding, or explaining your PhD topic to your relatives without crying, the idea’s the same: do it till it’s boring, and then keep doing it till it’s beautiful.\n","title":"How Agentic LLMs ~Almost~ Destroyed My Academic Career","type":"posts"},{"content":"","date":"October 19, 2025","externalUrl":null,"permalink":"/tags/programming/","section":"Tags","summary":"","title":"Programming","type":"tags"},{"content":"“Data races are bad.” — Every systems programming course, ever.\nBut why are they bad? And what exactly are they?\nLet’s be a little Aristotelian about this—question everything. So here we go:\nSo\u0026hellip; What Is a Data Race? # A data race occurs when two or more concurrent threads access the same memory location, at least one of them performs a write, and there\u0026rsquo;s no synchronization mechanism (like a lock or atomic operation) protecting that access.\nWhew. That’s a lot of words. Let’s unpack that.\n“Two or more threads” → We\u0026rsquo;re dealing with a multithreaded (or concurrent) program. A single-threaded program is too well-behaved to ever encounter a data race.\n“Same memory location” → Both threads are peeking (or poking) at the same variable.\n“At least one write” → If both threads are just reading, all’s peaceful. But the moment one tries to write—or update—the memory, conflict brews.\n“No synchronization” → No locks, no atomics, no coordination. It’s the memory access version of “every thread for itself.”\nPut all of that together, and you’ve got yourself a classic case of undefined behavior, kicking down the door and yelling: “You summoned me? 
Initiating FireRocket().”\nDekker’s Algorithm # The classic store-buffering example at the heart of Dekker’s algorithm: just two threads doing two things each.\nInitial Setup # We have two shared variables:\nint x = 0; int y = 0; And two threads:\nThread A: # x = 1; r1 = y; Thread B: # y = 1; r2 = x; That\u0026rsquo;s it. No mutex. No atomic. Just pure, unfiltered concurrent drama.\nWhat Should Be the Output? # Let’s think like a sequential human being.\nIf Thread A runs fully before Thread B:\nx = 1; r1 = y; // r1 = 0 ---------------- y = 1; r2 = x; // r2 = 1 =\u0026gt; r1 = 0, r2 = 1 If Thread B runs fully before Thread A:\ny = 1; r2 = x; // r2 = 0 ---------------- x = 1; r1 = y; // r1 = 1 =\u0026gt; r1 = 1, r2 = 0 If both writes happen before both reads:\nx = 1; y = 1; r1 = y; // r1 = 1 r2 = x; // r2 = 1 =\u0026gt; r1 = 1, r2 = 1 So our intuitive, reasonable human outputs are:\n(r1, r2) = (0, 1) (r1, r2) = (1, 0) (r1, r2) = (1, 1) All of these are fine and expected.\nBut Wait—A Quick Math Detour # Let’s get mathematical for a second. If you have two threads and each one performs two operations (a write and a read), the total number of interleavings of those operations is given by:\n$$ \binom{2 + 2}{2} = \frac{(2 + 2)!}{2! \cdot 2!} = \frac{4!}{2! \cdot 2!} = \frac{24}{4} = 6 $$So there are 6 valid interleavings of the operations from Thread A and Thread B.\nThese correspond to permutations where the relative order of operations within each thread is preserved. That is:\nx = 1 before r1 = y (Thread A’s order preserved) y = 1 before r2 = x (Thread B’s order preserved) Let’s list a few examples:\nA1 A2 B1 B2 → (r1 = 0, r2 = 1) A1 B1 A2 B2 → (r1 = 1, r2 = 1) B1 B2 A1 A2 → (r1 = 1, r2 = 0) B1 A1 B2 A2 → (r1 = 1, r2 = 1) …and so on. These 6 interleavings give rise to our expected outputs:\n(0,1), (1,0), and (1,1).\nIn general: for $n$ threads, where thread $i$ performs $o_i$ memory operations, $$ \text{Total Interleavings} = \frac{(o_1 + o_2 + \ldots + o_n)!}{o_1! \cdot o_2! \cdots o_n!} $$ So What’s the Problem? # That nasty little (r1, r2) = (0, 0) doesn’t appear in any of the above 6 valid interleavings!\nWhy? Because it would require reading both x and y before either write has occurred, which violates the intra-thread ordering unless reads were moved before writes.\nThis is why:\nReordering breaks the mathematical expectation based on interleavings and causes outputs that should be impossible under sequential consistency.\nSo while our math predicted 6 nice, clean outcomes, the CPU/compiler pulls a little abracadabra, breaks the sequential spell, and suddenly:\n(r1, r2) = (0, 0) But Here’s the Twist\u0026hellip; # Suppose your compiler or hardware chooses to reorder instructions, perhaps assuming it knows best. (For example, letting a later load execute before an earlier store, the classic store→load reordering, among other computer architecture behaviors—which I’ll delve into in a separate post.)\nImagine this:\nBoth threads reorder the read before the write. Now, Thread A reads y (which is still 0), then writes x = 1. Simultaneously, Thread B reads x (also still 0), then writes y = 1. So we get:\nr1 = y; // 0 r2 = x; // 0 x = 1; y = 1; =\u0026gt; r1 = 0, r2 = 0 ← the \u0026#34;impossible\u0026#34; outcome! But that shouldn’t be possible, right?\nIf either thread had seen the other\u0026rsquo;s write, one of the reads should be 1.\nYet due to reordering and no synchronization, both reads saw 0.\nThis output:\n(r1, r2) = (0, 0) should be impossible under sequential consistency.\nBut under real-world hardware + compiler optimizations, it can happen.\nWhat is sequential consistency now? 
# Leslie Lamport defines sequential consistency as:\n\u0026ldquo;The result of any execution is the same as if the operations of all the processors were executed in some sequential order, and the operations of each individual processor appear in this sequence in the order issued.\u0026rdquo;\nTranslation: It should feel like all memory operations from all threads were executed in some sequential order, even if they were running concurrently.\n$$ \forall a, b \in S,\ a \leq b\ \vee\ b \leq a $$For any two operations $a$ and $b$ in the set $S$, you can always determine which one comes before the other (or that they are the same).\nThis implies a total order of all operations. That is, everything can be lined up in a single straight line. It contrasts with partial orders, where some operations might be incomparable, like concurrent processes in a distributed system with no causal link.\nIf you only had a partial order (like Thread A and Thread B doing stuff independently), you\u0026rsquo;d never know which came first.\nWhich is great for eggs, terrible for memory consistency.\nDear Compiler: Please Stop Being So Smart # I mean, seriously. I wrote my code like this:\nx = 1; y = 2; I want you to do exactly that. Line by line.\nNo backflips, no clever tricks, no playing 4D chess with my instructions.\nI want you to translate it into assembly like a faithful scribe from an ecclesiastical journal—quietly, obediently, reverently.\nJust copy it down, convert into assembly, binary, the works!\nI’ve got the world’s biggest SSD, a GPU that can simulate Interstellar in 16K, and enough RAM to mine all the bitcoins—twice.\nOptimization is not my concern. I want correctness.\nSo what’s the problem?\nWell, here’s the catch: correctness isn’t the only thing you care about—even if you think it is.\nBecause guess what? 
Turning off every optimization doesn’t magically fix data races or memory consistency issues.\nThat bug you blamed on “compiler wizardry”?\nYeah, that’s on you, buddy.\nModern processors are sneaky little devils. They reorder instructions under the hood.\nThey prefetch, speculate, parallelize—all while laughing in binary.\nEven if the compiler leaves your code untouched, the hardware might still go, “eh, close enough.”\nSo unless you explicitly tell both the compiler and the CPU:\n“Hey, don’t mess this up—I mean it,”\nyou\u0026rsquo;re still rolling the dice.\nAnd if you think -O0 is your silver bullet?\nWell\u0026hellip; welcome to the illusion of safety.\nLet’s Make That Concrete # Here\u0026rsquo;s a simple-looking C++ example:\nint flag = 0; int data = 0; void writer() { data = 42; flag = 1; } void reader() { if (flag == 1) { printf(\u0026#34;%d\\n\u0026#34;, data); } } Two threads. writer() sets data, then signals with flag. reader() checks flag and reads data.\nLooks innocent, right? But this code has a data race.\nNo locks, no atomics, no memory fences.\nEven with -O0, the compiler is allowed to reorder instructions because C++ says: “you gave me undefined behavior, so all bets are off.”\nSo What Happens? # Maybe flag = 1 gets reordered before data = 42.\nMaybe reader() sees flag == 1, but reads stale or garbage from data.\nIt might run flawlessly on your machine, but crashes on the production server at 3AM—triggering a page and dragging you out of bed at an ungodly hour.\nFix It the Right Way # #include \u0026lt;atomic\u0026gt; std::atomic\u0026lt;int\u0026gt; flag{0}; int data = 0; void writer() { data = 42; flag.store(1, std::memory_order_release); } void reader() { if (flag.load(std::memory_order_acquire) == 1) { printf(\u0026#34;%d\\n\u0026#34;, data); } } This version uses proper synchronization:\nmemory_order_release in writer() ensures data is visible before flag is updated. memory_order_acquire in reader() ensures data is read after confirming flag. 
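To actually watch the fixed version run, here is a minimal, self-contained harness. Two caveats, both mine and not part of the snippet above: the reader spins instead of bailing out on a single if, so the outcome is deterministic, and run_demo() is just an invented driver for illustration.

```cpp
#include <atomic>
#include <thread>

std::atomic<int> flag{0};
int data = 0;

void writer() {
    data = 42;                                // plain write...
    flag.store(1, std::memory_order_release); // ...published by the release store
}

int reader() {
    // The post's reader gives up if flag is still 0; spinning instead makes
    // this sketch deterministic.
    while (flag.load(std::memory_order_acquire) != 1) { /* busy-wait */ }
    return data; // acquire synchronizes-with release: guaranteed to see 42
}

// Run both threads once and report what the reader observed.
int run_demo() {
    data = 0;
    flag.store(0, std::memory_order_relaxed);
    int seen = 0;
    std::thread w(writer);
    std::thread r([&seen] { seen = reader(); });
    w.join();
    r.join();
    return seen;
}
```

Built with something like g++ -std=c++17 -pthread, every call to run_demo() returns 42. Swap both memory orders for std::memory_order_relaxed and that guarantee evaporates, even if your machine still happens to produce 42 most of the time.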
Bottom Line? # Correctness isn’t passive.\nYou don’t get it by turning knobs like -O0. You earn it by using the right tools—locks, atomics, memory barriers.\nBecause neither your compiler nor your CPU is in the business of babysitting your assumptions.\nWelcome to the real world!\nYou wanted a cup of water.\nCompiler optimization will give you a caramel swirl, three skim, one sugar — and maybe some regret.\nSource\nIn the next part, we will cover the memory models of programming languages.\n","date":"April 6, 2025","externalUrl":null,"permalink":"/posts/data-races/","section":"Posts","summary":"“Data races are bad.” — Every systems programming course, ever.\nBut why are they bad? And what exactly are they?\nLet’s be a little Aristotelian about this—question everything. So here we go:\n","title":"Data Races","type":"posts"},{"content":"","date":"April 6, 2025","externalUrl":null,"permalink":"/tags/systems/","section":"Tags","summary":"","title":"Systems","type":"tags"},{"content":"Recently, The Hindu published an article that raised a pertinent question regarding the National Institutional Ranking Framework (NIRF): \u0026ldquo;Is quantity trumping quality?\u0026rdquo; As I write from the hallowed halls of one of India’s premier institutions—the Indian Institute of Technology, Madras—I find myself compelled to delve deeper into this issue. Are private institutions genuinely improving, or have they [sic] discovered ways to manipulate research metrics?\nIt is often said that data is the new oil, but just like oil, it needs refinement to be of any use.\nThe Illusion of Metrics # NIRF rankings have become a vital element of academic evaluation in India, but they are not without their controversies. One significant point frequently overlooked is the potential disconnect between research output and its true impact. 
To be clear, I am not comparing the fee structures of public institutions like Jadavpur University, which once had annual tuition as low as ₹3,000, with private institutions like VIT that charge significantly more. More funding certainly enables the creation of superior infrastructure. However, the question remains: How is the quality of academic work truly being measured?\nThis raises the question: Are we merely counting the number of publications, or are we genuinely evaluating their substance? The NIRF leans heavily on metrics such as the number of publications, faculty-student ratios, and other numerical indicators. But does this approach capture the true essence of academic excellence, or are we simply engaging in numerus stultorum—the counting of fools?\nA U.S. Parallel: Electoral Misrepresentation # For perspective, let’s consider the demographics of the United States. Population density is highest along the coasts, while the \u0026ldquo;flyover states\u0026rdquo; in the center are sparsely populated. However, voting power is not proportionate to population—the Electoral College guarantees even the smallest states a minimum weight. The result is that the fate of the country can be shaped by a smaller population in states like Wyoming, rather than by populous states like New York or California. Similarly, simplistic metrics in university rankings can distort the picture, giving undue weight to institutions that excel in quantity but fall short on quality.\nJust as the U.S. electoral system is criticized for its disproportionate distribution of power, the NIRF ranking system could be subject to similar critiques. The focus on a narrow set of metrics risks ignoring the multidimensional nature of academic success.\nThe Chen et al. Paper: MIT as a Confidence Marker # I read a paper by Chen, Manolios, and Riedewald titled Why Not Yet: Fixing a Top-K Ranking That Is Not Fair to Individuals, available here. The authors explore how omitting a top institution, such as MIT, from a ranking of U.S. 
universities in Computer Science could undermine confidence in the ranking system. The absence of such a heavyweight would signal flaws in the evaluation process.\nApplying this to India, if IITs suddenly disappeared from the upper echelons of the NIRF rankings, our confidence in the system would be shaken. While the rankings themselves may not be incorrect, they would suggest that the current metrics are inadequate. Chen and Manolios argue for factoring in a more attribute-rich ranking process. These attributes can range from industry partnerships and global research outreach to international collaborations, offering a more holistic view of an institution\u0026rsquo;s true value.\nThe Problem with Simplified Metrics # It’s not just about the numbers; it’s about the meaning behind them. For example, publishing k papers in low-impact journals is not necessarily more valuable than publishing one paper in a prestigious, high-impact journal. Yet, NIRF often treats these two scenarios as comparable, favoring quantity over quality. A similar issue exists with U.S. News \u0026amp; World Report rankings, where factors like alumni donations and faculty salaries sometimes overshadow more meaningful indicators like student outcomes and teaching effectiveness.\nWe’ve all heard the adage “There are three kinds of lies: lies, damned lies, and statistics.” The danger of ranking systems lies in their reliance on simplistic metrics that only tell part of the story. A university might boast a high number of graduates, but if those graduates are poorly prepared for the workforce, what does that figure really signify? 
Similarly, an institution may publish a high number of research papers, but if those papers have minimal impact, can it truly claim to be advancing knowledge?\nThe faculty-student ratio # As demand for higher education rises and India accelerates its growth in manufacturing in pursuit of a stronger economy, student enrolment will only keep increasing. NIRF places significant emphasis on the student-faculty ratio. In response, the need for highly qualified and in-demand professors is at an all-time high. However, a critical issue persists—there is a shortage of individuals pursuing careers in academia.\nWhile engineering remains a popular field of study, drawing large numbers of students, the academic profession is not witnessing the same level of enthusiasm, leading to a growing disparity. This imbalance, where more students are entering the field without a corresponding rise in faculty, could have serious consequences. If the NIRF continues to penalize institutions with low faculty-to-student ratios, it should serve as a wake-up call for both society and the government to incentivize and encourage more individuals to pursue academic careers.\nToward a More Nuanced Ranking System # What we need is a ranking system that reflects the full complexity of academic institutions. NIRF and other ranking bodies must move beyond binary, yes-or-no metrics. They need greater flexibility to account for the intricacies of academic output. More attention should be given to the quality of research, the societal impact of innovations, and the international reputation of faculty and alumni. 
While I am not suggesting that Western scholarship is a perfect metric, identifying reputable international conferences and factoring them into rankings would offer a clearer picture of an institution\u0026rsquo;s standing.\nIn a world increasingly driven by data, the challenge lies in ensuring that the data tells the complete story—not just the parts that are easy to quantify.\n","date":"September 10, 2024","externalUrl":null,"permalink":"/posts/is-quantiy-trumping-quality/","section":"Posts","summary":"Recently, The Hindu published an article that raised a pertinent question regarding the National Institutional Ranking Framework (NIRF): “Is quantity trumping quality?” As I write from the hallowed halls of one of India’s premier institutions—the Indian Institute of Technology, Madras—I find myself compelled to delve deeper into this issue. Are private institutions genuinely improving, or have they [sic] discovered ways to manipulate research metrics?\n","title":"Is Quantity Trumping Quality in University Rankings? A Closer Look at NIRF and Beyond","type":"posts"},{"content":"","date":"September 10, 2024","externalUrl":null,"permalink":"/tags/politics/","section":"Tags","summary":"","title":"Politics","type":"tags"},{"content":" Introduction # Anyone who has taken a course in Computer Science has probably come across a B+ tree, often used in the context of databases for storing data. A B+ tree schematically looks like this:\n[ 1003 ] / | \\ [1001] [1002] [1004 1005] [1007] / | | | [Naruto] [Sasuke] [Sakura Hinata Kakashi] [Itachi] In a B+ tree, the data always lies in the leaf nodes.\nHowever, what are the advantages of using such a structure in a database? Considering that data always resides on disk, how are disk operations optimized? Where does the OS come into play in optimizing queries? Ultimately, running a SQL application involves executing a binary, say ./sql, which runs in RAM. 
So, how does this binary interact with the disk, seek through it, return to RAM, perform calculations, and fetch more data from the disk? These questions prompted me to explore deeply what happens when you execute:\nSELECT * FROM table WHERE id=\u0026#39;122\u0026#39;; Setup # Let\u0026rsquo;s consider a database with the following structure:\nStudent ID | Name | Marks ------------------------------ 1001 | Naruto | 85 1002 | Sasuke | 72 1003 | Sakura | 90 1004 | Hinata | 78 1005 | Kakashi | 88 1006 | Shikamaru| 65 1007 | Itachi | 92 1008 | Gaara | 81 1009 | Rock Lee | 70 1010 | Neji | 87 In the disk, the data is stored in a B+ tree as follows:\n[ 1003 ] / | \\ [1001] [1002] [1004 1005] [1007] / | \\ | | [Naruto] [Sasuke] [Sakura Hinata Kakashi] [Itachi] When creating the table, we inform the query engine which column is our indexed key. The index key is used to navigate through the B+ tree. Remember, the values are always in the leaf nodes, so we must traverse through the B+ tree.\nTraversal # So, what happens when the query is executed? The OS needs to know where to locate the root of the B+ tree. Usually, this information is kept in a System Catalog (Metadata). This catalog includes information about tables, columns, data types, constraints, and indexes. In our minimal example, the table type and the indexed student_id are stored in the catalog. The system catalog includes pointers or references to the index files.\nOnce the root node is located on the disk, it is loaded into RAM, and the execution starts for the engine.\nRoot Node (in RAM): Keys: [1004] Pointers: [pointer_to_child_1, pointer_to_child_2] Child Nodes (on disk): Node 1 (1001, 1002, 1003) Node 2 (1005, 1006, 1007) Sequence Diagram # When the RAM loads the B+ tree node, it decides which child node to access next, requiring a disk operation. 
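To make that decide-in-RAM, fetch-from-disk loop concrete, here is a toy sketch. Everything in it is invented for illustration: the Node layout, the vector standing in for the disk, and the seeks counter correspond to no real engine's internals.

```cpp
#include <cstddef>
#include <vector>

// Hypothetical page layout, invented for illustration only.
struct Node {
    bool is_leaf;
    std::vector<long> keys;            // separators (internal) or record keys (leaf)
    std::vector<std::size_t> children; // "disk offsets" of child pages (internal only)
};

// A vector standing in for the disk: reading an element plays the role of a seek.
std::vector<Node> disk;
int seeks = 0; // tally of the expensive operations

Node read_node_from_disk(std::size_t offset) {
    ++seeks; // every level of the tree costs one of these
    return disk[offset];
}

// Walk from the root toward the leaf that may hold `key`.
// The comparisons happen in RAM; the dominant per-level cost is the disk read.
std::size_t find_leaf(std::size_t root, long key) {
    std::size_t offset = root;
    for (;;) {
        Node node = read_node_from_disk(offset); // trip to "disk"
        if (node.is_leaf) return offset;         // values always live in the leaves
        std::size_t i = 0;
        while (i < node.keys.size() && key >= node.keys[i]) ++i;
        offset = node.children[i];               // descend one level
    }
}
```

With a root page holding key [1004] and two leaf pages, as in the snippet above, looking up Student ID 1002 touches the "disk" exactly twice: once for the root, once for the leaf.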
This seems inefficient, doesn\u0026rsquo;t it? Each decision necessitates a trip to the disk to fetch the next node, which is time-consuming. Any Computer Science graduate knows that RAM operations are much faster than disk operations. Even if one is not a CS graduate, it is intuitive—disk operations involve mechanical components physically moving the head to the correct sector, introducing inertia.\nClearly, this approach is superior to a linear search where each Student ID is loaded into RAM, compared, discarded, and then the next block is loaded and searched. A linear search would involve one load to RAM and one comparison per node. Instead, the system brings the top k nodes of the hierarchy into RAM, leaving the leaf nodes on the disk. Instead of searching the entire leaf node, the system filters decisions at the RAM level using pointers to determine which subtree may contain the desired index.\nFor instance, in a large tree with 20+ levels:\nLevel 1 (RAM): [Root Node] Level 2 (RAM): [Internal Nodes] Level 3 (RAM): [Internal Nodes] Level 4-20 (Disk): [Internal Nodes] ... [Leaf Nodes] This significantly reduces the time required for the search. Additional techniques like cache eviction and LRU (Least Recently Used) strategies can further optimize performance. However, the overarching idea remains the same.\nFetching from Disk to RAM # Below is a representation of how the upper levels of the tree live in RAM while leaf data is fetched from the disk:\n+------------------------------------+ | RAM | +------------------------------------+ | Root Node [1003] | | Internal Node 1 [1001, 1002] | | Internal Node 2 [1004, 1005] | +------------------------------------+ | | v v +----------------+ +-----------------+ | Disk | | Disk | +----------------+ +-----------------+ | Leaf Nodes | | Leaf Nodes | | [Naruto, Sasuke] | [Sakura, Hinata]| +----------------+ +-----------------+ What if the column is not INDEXED # When a column is not indexed, the database must perform a linear search to locate the desired data. 
This involves scanning each row in the table sequentially, which is highly inefficient, especially for large datasets. In contrast, an indexed column allows the database to quickly locate data using a structured B+ tree, significantly reducing search time. The WHERE clause on an indexed column leverages the B+ tree\u0026rsquo;s hierarchy to rapidly navigate to the relevant data, minimizing disk access and computational overhead. Therefore, queries on indexed columns execute much faster than those on non-indexed columns, highlighting the critical role of indexing in database performance optimization.\nConclusion # B+ trees are essential for optimizing SQL query performance through efficient data organization and indexing. By leveraging the speed of RAM for intermediate node processing and minimizing disk access to leaf nodes, B+ trees significantly enhance query execution times. This hierarchical approach reduces costly disk seeks, capitalizing on the rapid access speeds of RAM for decision-making. Techniques like cache eviction and LRU further improve efficiency by keeping frequently accessed nodes readily available. This deep dive into B+ tree mechanics underscores the importance of indexing in modern databases, demonstrating how foundational computer science concepts can solve practical problems, ensuring databases remain robust, efficient, and responsive.\n","date":"June 11, 2024","externalUrl":null,"permalink":"/posts/b+-trees-sql-optimization/","section":"Posts","summary":"Introduction # For someone who has taken a course in Computer Science, they have probably come across a B+ tree, often used in the context of databases for storing data. A B+ tree schematically looks like this:\n","title":"How B+ Trees Optimize SQL Queries: A Primer","type":"posts"},{"content":"Preface: Rishi Durvasa, known for his irascible nature, was infamous for his ability to curse. 
As mythology suggests, Durvasa visited Kanav Rishi\u0026rsquo;s ashram, and Shakuntala was lost in her daydreams of Dushyant. Furious, Durvasa cursed Shakuntala, saying that the one she dreamed of would forget her when the time came. This is a modern retelling of that ancient story.\nThe ceiling fans at Kanav Café whirred lazily above, barely stirring the humid afternoon air. Durvasa sat at his usual corner table, nursing a black coffee, his eyes flicking between the window and his notebook. His mind, however, was far from the café, lost in the endless research spiral. He was close—tantalizingly close—to finding a cure for a rare form of Alzheimer\u0026rsquo;s that had haunted him for years. The disease crept like a thief, stealing memories and identities, stripping people of their past. But Durvasa, with his relentless mind, was determined to stop it. Or, perhaps, rewrite it.\nHis concentration broke as the bell above the café door jingled. Shakuntala entered with her usual burst of energy, her laughter rippling through the air like a melody he couldn’t tune out. Her hair was loosely tied back, and she had that radiant glow she always carried after a long morning spent on the phone with Dushyant. She spotted Durvasa and waved, her eyes sparkling as she made her way over with her familiar chatter.\n\u0026ldquo;You won\u0026rsquo;t believe what Dushyant said to me today,\u0026rdquo; she began without preamble, sliding into the chair opposite him. \u0026ldquo;He was so sweet yesterday. We went to this little bookshop—you would have loved it! I think he picked out this book on mythology, something about the Greek goddess Athena. Anyway, we spent hours talking about life, the future, and even marriage.\u0026rdquo; She sighed dramatically, her eyes alight with memories she cherished.\nDurvasa smiled tightly, lifting his cup to his lips. This was their routine—she would talk about Dushyant, and he would listen with half an ear while his thoughts wandered back to his lab. 
But today, something in him stirred differently. A quiet storm brewed beneath his calm exterior, a dark resolve taking root.\n\u0026ldquo;I’m serious, Durvasa,\u0026rdquo; she continued, \u0026ldquo;he was just\u0026hellip; perfect. We even talked about raising children. Can you imagine? Little Dushyants running around!\u0026rdquo; She laughed, unaware of the quiet rage flickering in Durvasa’s heart.\nDurvasa nodded absentmindedly, feeling the weight of the vial in his pocket—the culmination of months of obsessive research. Shakuntala had no idea, but she had been the unwitting inspiration behind his latest breakthrough. He had spent sleepless nights pondering a question that twisted his heart with every thought of her: “What if you could preserve the memories that mattered most? What if you could forget the pain, the rejection, the unrequited love\u0026hellip; but hold on to the joy, the warmth, the dreams?”\nHer words faded into the background as Durvasa’s mind raced. He had found the answer in a drug, a memory-preserving solution that could forever etch certain moments into the mind. And Shakuntala, unknowingly, was about to receive his modern curse.\nShakuntala reached for her cup, but Durvasa stopped her. “Wait,” he said softly, almost too casually. \u0026ldquo;I think your almond croissant is here.\u0026rdquo; She stood up to fetch her order, and in that moment, a flood of bitterness surged through Durvasa’s veins. His mind flashed with images of her smile, laughter, and endless stories about Dushyant—stories that would never be about him. A decision darker than any curse uttered by the rishi of old formed in his mind.\nWhen Shakuntala returned, he reached into his pocket, pulling out the small vial with a dropper. He added a few drops to her coffee with a casual smile. “I’ve been working on something new,” he said smoothly. “A little\u0026hellip; enhancer. It’ll make your coffee better, I promise.”\nShe raised an eyebrow, intrigued. “Really? Alright, Dr. 
Durvasa. Hit me with your magic.”\nHe smiled faintly as she took a sip. She grimaced at the bitterness but smiled at the aftertaste. “Tastes\u0026hellip; different. Stronger.”\nDurvasa watched her, a mixture of guilt and grim satisfaction tightening in his chest. He knew what he had done—he had taken a risk. For years, he had listened to Shakuntala speak passionately about Dushyant. For years, he had admired the depth of her love for another. Yet, there was something quietly tragic, even maddening, about her endless devotion. And why not him? Why did her heart always belong to someone else?\n“Tell me more about him,” Durvasa urged gently, his voice almost distant.\nShe smiled, her eyes lighting up with that same familiar glow. “Oh, Durvasa, he’s just\u0026hellip; perfect. The way he looks at me, the way he holds me\u0026hellip; it’s like nothing else matters. I feel so safe with him.”\nHer words drifted into the air, but Durvasa’s heart twisted with a sharp, bitter pain as he realized the full weight of what he had done. She would never forget Dushyant. No matter how much time passed, no matter what joys or sorrows life brought her, his face would be seared into her mind, unyielding, like a ghost that refused to fade. And yet, that memory would be her torment, because Dushyant would forget her—he was destined to.\nDurvasa\u0026rsquo;s thoughts darkened, consumed by a venomous rage. Dushyant will forget you, he thought coldly. He comes from a family of wealth, of status, where love is just a fleeting fancy to be discarded when it becomes inconvenient. His world will swallow him whole—the expectations of his lineage, the pressure to marry into wealth, to preserve the image of the perfect life. 
While your father was a humble social worker, a priest at the local temple, a friend to all, and your mother a modest dancer—people whose union was barely tolerated even within their own modest circles—how could you ever believe that your love with one of the city\u0026rsquo;s wealthiest sons would ever be welcomed?\nThere was a cruel smile playing on his lips now, a reflection of the storm raging in his heart. For him, you are just a passing chapter, a secret thrill in the shadow of his real life. But for me\u0026hellip; Durvasa\u0026rsquo;s hands clenched around his cup, his knuckles white. For me, Shakuntala, you were always the \u0026lsquo;one.\u0026rsquo;\nA bitter laugh escaped him, laced with venom and sorrow. \u0026ldquo;I will live my life haunted by your thoughts,\u0026rdquo; he whispered, almost as if confessing to himself. \u0026ldquo;Every smile, every glance, every word you spoke of him\u0026hellip; it will haunt me for a lifetime. I will carry it with me like a festering wound that will never heal.\u0026rdquo;\nHe looked at her, sitting across from him, still unaware of the dark magic that had been sealed into her mind. And then the cruelty in his voice sharpened. But now, with this elixir, this curse\u0026hellip; his thoughts hissed. You will share that burden. You will live your life thinking about someone who can never be yours. As Dushyant moves on, builds a life, raises sons, finds a woman his world accepts\u0026hellip; you will be left with the memory of him, forever etched into your soul. A love that is unreachable, untouchable, because he will never be yours.\nThe weight of his words—unspoken but burning in his mind—settled like a heavy shadow. In this modern world, the curse of love was no longer a matter of gods and sages, but a concoction of science, a poison masked as a gift. Durvasa knew what he had done was more than cruel. 
And in this final act of his devotion, he had sealed both of their fates: his, to a life of quiet torment, hers, to an eternal longing for a love that would never return.\n","date":"April 12, 2024","externalUrl":null,"permalink":"/posts/echoes-of-a-timeless-curse/","section":"Posts","summary":"Preface: Rishi Durvasa, known for his irascible nature, was infamous for his ability to curse. As mythology suggests, Durvasa visited Kanav Rishi’s ashram, and Shakuntala was lost in her daydreams of Dushyant. Furious, Durvasa cursed Shakuntala, saying that the one she dreamed of would forget her when the time came. This is a modern retelling of that ancient story.\n","title":"Echoes of a Timeless Curse","type":"posts"},{"content":"","date":"April 12, 2024","externalUrl":null,"permalink":"/tags/literature/","section":"Tags","summary":"","title":"Literature","type":"tags"},{"content":"Any sufficiently complicated C or Fortran program contains an ad hoc, informally-specified, bug-ridden, slow implementation of half of Common Lisp.\nBefore you think I\u0026rsquo;m crowning Lisp as God\u0026rsquo;s own language after just one blog stint at a coding exercise, hold your horses. I\u0026rsquo;m not here to bash Object-Oriented Programming or its design patterns. In fact, I believe it\u0026rsquo;s crucial to know these patterns inside out. Only then can you play the game of \u0026lsquo;Design Pattern or Anti-Pattern?\u0026rsquo; with any confidence. Remember, these musings are all from my little corner of the world and don\u0026rsquo;t reflect the hard work of other developers who’ve been sweating over StarPlat.\nNow, it\u0026rsquo;s not every day that I wake up with a burning desire to write a major mode for Emacs. Most programming languages already have their major modes written by far more ambitious souls. However, in my journey at IIT Madras I stumbled upon a DSL called StarPlat, which didn\u0026rsquo;t have its own major mode in Emacs. 
So, I thought, \u0026lsquo;Why not?\u0026rsquo; and set out to create one, partly for the challenge, partly for my own learning.\nThe syntax highlighting looks something like this: Syntax Highlighting: In Emacs Lisp, we use font-lock-defaults, a buffer-local variable. It needs a list that describes the syntax of the language for which the major mode is set. Here\u0026rsquo;s how it looks for StarPlat: (defvar starplat-font-lock-keywords (let* ( ;; Define the list of keywords (keywords \u0026#39;(\u0026#34;if\u0026#34; \u0026#34;else\u0026#34; \u0026#34;return\u0026#34; \u0026#34;function\u0026#34; \u0026#34;for\u0026#34; \u0026#34;while\u0026#34; \u0026#34;in\u0026#34;)) ;; Define the list of built-in-functions (built-in-funcs \u0026#39;(\u0026#34;iterateInBFS\u0026#34; \u0026#34;iterateInReverse\u0026#34;)) (data-types \u0026#39;(\u0026#34;int\u0026#34; \u0026#34;float\u0026#34; \u0026#34;string\u0026#34; \u0026#34;bool\u0026#34; \u0026#34;SetN\u0026#34; \u0026#34;SetE\u0026#34;)) ;; accumulate regexp strings for each keyword category (keywords-regexp (regexp-opt keywords \u0026#39;words)) (data-types-regexp (regexp-opt data-types \u0026#39;words)) (built-in-funcs-regexp (regexp-opt built-in-funcs \u0026#39;words))) `( (,keywords-regexp . font-lock-keyword-face) (,data-types-regexp . font-lock-type-face) (,built-in-funcs-regexp . font-lock-builtin-face) ;; TODO: add further patterns that require syntax highlighting ))) Reference: various syntax highlighting options\nLet\u0026rsquo;s breakdown what is happening here -\nIn Emacs Lisp, a pair of values is often represented as a cons cell. Here, (,keywords-regexp . font-lock-keyword-face) is a cons cell where ,keywords-regexp is the car (the first element) and font-lock-keyword-face is the cdr (the second element).\nIn detail:\nfont-lock-keyword-face: This is one of the predefined faces in Emacs for syntax highlighting (font-locking in Emacs terminology). 
Emacs uses it by default to highlight keywords.\n,keywords-regexp: The comma character at the beginning signals that keywords-regexp is a symbol that needs to be evaluated. The value of keywords-regexp is built up in the let* block above.\n
          ┌──────────────┐
          │ Bag of Words │
          └──────┬───────┘
                 │  Regex Match
    ┌────────────┼────────────┐
    │            │            │
┌───┴────┐  ┌────┴─────┐  ┌───┴────┐
│Keywords│  │ Built-in │  │  Data  │
│        │  │  funcs   │  │ types  │
└───┬────┘  └────┬─────┘  └───┬────┘
    │            │            │
    └────────────┼────────────┘
                 │  Transformation
          ┌──────┴───────┐
          │ Bag of Words │
          └──────────────┘
This particular list can be the list of keywords, data-types, or built-in-funcs. Reference: extensive documentation on how to use font-lock-defaults and related functions.\nImagine you\u0026rsquo;re teaching a black box to play \u0026lsquo;color by numbers\u0026rsquo; with words. You need to:
a. Define the pattern (the number).
b. Assign a color to each pattern.
c. Watch as the black box colors your words accordingly.
This is essentially the famous \u0026lsquo;Interpreter Pattern\u0026rsquo;, but in Lisp, it feels like you\u0026rsquo;re just casually tossing words into a magic cauldron and watching them transform. It\u0026rsquo;s a smooth, almost effortless approach to syntax highlighting that showcases Lisp’s elegance and power.\n","date":"January 6, 2024","externalUrl":null,"permalink":"/posts/major-mode-el/","section":"Posts","summary":"Any sufficiently complicated C or Fortran program contains an ad hoc, informally-specified, bug-ridden, slow implementation of half of Common Lisp.\nBefore you think I’m crowning Lisp as God’s own language after just one blog post about a coding exercise, hold your horses. I’m not here to bash Object-Oriented Programming or its design patterns. In fact, I believe it’s crucial to know these patterns inside out. Only then can you play the game of ‘Design Pattern or Anti-Pattern?’ with any confidence. 
Remember, these musings are all from my little corner of the world and don’t reflect the hard work of other developers who’ve been sweating over StarPlat.\n","title":"Major Mode El","type":"posts"},{"content":"Am I too comfortable or am I dying from within\nI wish at the word go I could drop my skin\nThe quotes of the dead do not interest me\nNor does the aphorism of the sickened living\nI am still searching for lost letters and their meaning\nTo go back to the gardens of my uninterrupted musing\nAnd every night I hope to find the old you;\nAnd every day I hope for a new story to begin.\n","date":"May 30, 2023","externalUrl":null,"permalink":"/posts/the-circle-of-incompleteness/","section":"Posts","summary":"Am I too comfortable or am I dying from within\nI wish at the word go I could drop my skin\nThe quotes of the dead do not interest me\nNor does the aphorism of the sickened living\n","title":"The Circle of Incompleteness","type":"posts"},{"content":"Happy Birthday to Me: Musings on Language and Code\nToday, I find myself celebrating my birthday in the tranquil confines of a cozy Airbnb in Vancouver. It’s a momentary escape from the pressures of work, a rare breather amidst the ever-present demands of the office. Through the frost-touched window, I watch the Canadian flag battle the icy winds. Below, I can hear the faint murmurs of a French-speaking couple, my neighbors. A little later, the cadence of Mandarin reaches my ears, no doubt my landlord going about his day. And then, the phone rings. It\u0026rsquo;s my mother, calling from home, so naturally, the conversation begins in Bengali. When my father takes the phone, we switch seamlessly to Hindi. After the call, I set the phone down, open my laptop, and begin composing an email in English.\nAs the email closes, I dive into writing code, shifting once again—but this time into the realm of Java. Before long, I realize the data needs some cleaning up, so Python it is. 
A little infrastructure work follows, and suddenly I’m typing away in TypeScript.\nAnd so I ask myself: why so many languages? Why do we burden ourselves with this linguistic and computational multiplicity? The world has long sought a universal tongue. Philosophers and scientists alike have proposed solutions, from the constructed language of Esperanto to the formalized structure of Loglan. Yet here we remain, in a world where diversity of language reigns, and we must transition from one to another at a moment\u0026rsquo;s notice.\nThis rumination takes me back to a particularly memorable lesson from my programming languages (PL) class, where my professor once remarked, \u0026ldquo;Semantics are greater than syntax.\u0026rdquo; Semantics, the meaning behind the words and symbols, form the bedrock of all communication, be it in human languages or in code. Syntax, the structure and form, is what gives language its recognizable shape, but it is the semantics that give it life.\nLet’s stretch this notion beyond programming and into everyday speech. Take Latin, for example: an ancient language from which so many modern tongues have evolved. For nearly every Latin term, there exists an English counterpart. Suo Motu becomes “on its own motion,” Ad Hominem transforms into “to the person.” Each word, though different in form, carries with it the same weight of meaning—the same semantics. The symbols may change, but the message remains.\nYet, in the legal world, Latin has emerged as the lingua franca, a curious decision for a modern field. One might argue that Latin’s dominance stems from its ancient roots, with much of Western legal tradition borrowing heavily from Roman law, making Latin a natural choice. However, this reasoning feels like a mere case of \u0026ldquo;following the herd\u0026rdquo; or \u0026ldquo;going with the flow\u0026rdquo;—a tendency to adopt something just because it has been done before. 
There is a deeper explanation, I believe, rooted not in convenience but in the allure of the precision Latin offers.\nConsider Japanese, a language rich with words that have permeated the world of business and management. It\u0026rsquo;s not as if other languages lack equivalents for terms like Ikigai (reason for being) or Kaizen (continuous improvement), or even Shoshin (beginner\u0026rsquo;s mind) and Gemba (the actual place). These words evoke a sense of discipline and mindfulness that transcends their simple translations. English may offer approximations, but the adoption of these Japanese terms in global contexts speaks to something greater than just a linguistic gap. It\u0026rsquo;s about the cultural weight behind the words—the ethos that underpins their meaning.\nThe same phenomenon occurs when I turn to the world of emotions and the arts. A sad poem can be written in English or Hindi, but when heartache strikes deeply, I turn to the lyrical beauty of Bengali or Urdu / Hindustani for solace. The tender phrases, the soft melancholy of Urdu literature, seem to resonate with the human condition in a way that no mere translation can capture. Personally, I find myself drawn to Tamil songs when in need of comfort, though I must often search for their English translations. And Bengali is a complete universe of its own. I sometimes feel lucky to have been born into a Bengali family, to know enough Bengali that I don\u0026rsquo;t have to translate word by word, only to look up the occasional word in a sentence. (Ya, it\u0026rsquo;s bad, but more manageable than, say, Spanish or Tamil.) Even with the meaning intact, the experience feels incomplete; the original language carries a certain emotional depth that translations simply cannot map one-to-one.\nThis, I think, perfectly illustrates the difference between syntax and semantics. 
You can change the words, but the true meaning—the emotional or cultural resonance—may not always follow.\n
// In C
for (int i = 0; i \u0026lt; size; i++) {
  printf(\u0026#34;%d\\n\u0026#34;, list[i]);
}

// In C++
for (int val : vec) {
  std::cout \u0026lt;\u0026lt; val \u0026lt;\u0026lt; std::endl;
}

// In Scala
// Using foreach to traverse
list.foreach(println)
// Alternatively, you can use a simple for loop
for (elem \u0026lt;- list) {
  println(elem)
}

;; In Racket
(define (my-functor f l)
  (map f l))
(my-functor (lambda (x) (displayln x)) list)

In Brainf*ck:
+[-\u0026gt;,----------] ; Reads input until 0 (input ended)
\u0026gt;++++++++++++[-\u0026gt;+++++++++++++\u0026lt;]\u0026gt;+++. ; Output first value (65 -\u0026gt; A)

Programming languages exhibit a vast range of syntactic and semantic differences, and these differences play a crucial role in how various constructs, such as list traversal, are executed. This variance raises essential questions about the orthogonality of programming languages—specifically, their capacity to support multiple paradigms for achieving similar tasks. For instance, does a language restrict itself to traditional constructs like the for loop, or does it embrace functional programming paradigms, offering operations like map, list comprehensions, or functors?\nThe primary objective in list traversal remains constant across languages, yet the choice of method alters the semantics of the operation. A for loop is often more familiar to many programmers, providing clarity and ease of understanding, while functional approaches, such as a map function with a lambda expression, offer a more elegant and concise method. These functional paradigms tend to generate more robust, maintainable code. 
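To make the contrast concrete in one more language, here is a small Python sketch (mine, not from the snippets above) showing the imperative loop, map with a lambda, and a comprehension all expressing the same traversal:

```python
nums = [1, 2, 3, 4]

# Imperative traversal: an explicit for loop, familiar and stepwise
squares_loop = []
for n in nums:
    squares_loop.append(n * n)

# Functional traversal: map with a lambda, concise and declarative
squares_map = list(map(lambda n: n * n, nums))

# A list comprehension, Python's idiomatic middle ground
squares_comp = [n * n for n in nums]

# All three express the same intent; only the "how" differs
print(squares_loop, squares_map, squares_comp)  # each is [1, 4, 9, 16]
```

Python is a convenient example of orthogonality here precisely because all three styles coexist in one language, so the choice between them is purely a choice of semantics, not of toolchain.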
Despite both methods being syntactically valid, the choice between them has broader implications for readability, flexibility, and a language’s capacity to facilitate more complex, functional patterns.\nThis brings us to an important realization: the popularity of a programming language often does not hinge solely on its syntax. A language\u0026rsquo;s success is often driven by its semantics—the underlying meaning behind its constructs—and the ecosystem it fosters, including libraries, frameworks, and tools. Semantics, in this context, are far more critical than simply offering numerous syntactic options. The strength of a language lies in how clearly and efficiently it can express complex ideas, rather than how many ways it allows one to express the same task.\nAt the heart of this discussion is a critical question: Is it possible to design a language where the syntax is intuitive, yet the underlying semantics are versatile enough to accommodate diverse use cases? Can we create a programming language that is not merely a tool but a medium—one that is expressive, relatable, and capable of encapsulating the nuances of the problems it aims to solve?\nThe challenge in language design lies in achieving a balance between generality and specificity. A language designed for broad applicability needs to offer both flexible syntax and deep semantic richness, whereas a domain-specific language might prioritize strict syntactic rules to ensure precision and correctness. The ultimate goal is to find a middle ground—a language that offers enough expressive power and robust semantics to handle complex problems while remaining accessible and intuitive to a wide range of developers.\n","date":"March 24, 2023","externalUrl":null,"permalink":"/posts/language-semantics/","section":"Posts","summary":"Happy Birthday to Me: Musings on Language and Code\nToday, I find myself celebrating my birthday in the tranquil confines of a cozy Airbnb in Vancouver. 
It’s a momentary escape from the pressures of work, a rare breather amidst the ever-present demands of the office. Through the frost-touched window, I watch the Canadian flag battle the icy winds. Below, I can hear the faint murmurs of a French-speaking couple, my neighbors. A little later, the cadence of Mandarin reaches my ears, no doubt my landlord going about his day. And then, the phone rings. It’s my mother, calling from home, so naturally, the conversation begins in Bengali. When my father takes the phone, we switch seamlessly to Hindi. After the call, I set the phone down, open my laptop, and begin composing an email in English.\n","title":"Language Semantics","type":"posts"},{"content":"In the House of Commons of Great Britain, Edmund Burke once stated, \u0026ldquo;There were Three Estates in Parliament; but, in the Reporters’ Gallery yonder, there sat a Fourth Estate more important than them all.\u0026rdquo; As the 21st century dawned, we observed media houses displaying biases towards their respective lobbies. The \u0026rsquo;truth,\u0026rsquo; which should be absolute, now has shades of well-crafted absolution. \u0026ldquo;Whose truth is the truth?\u0026rdquo; is a question I often ask myself while reading news articles on the internet. With the government in power controlling the first three estates, it\u0026rsquo;s not an exaggeration to say that Orwell\u0026rsquo;s \u0026ldquo;1984\u0026rdquo; no longer seems like fiction. This blog explores the hypothetical \u0026ldquo;00 Estate\u0026rdquo; that encompasses all estates and dictates the terms for writing code.\nI recently watched an inspiring talk by Uncle Bob on the future of programming, where he stressed the importance of writing \u0026lsquo;good code\u0026rsquo; and the dangers of neglecting it. The \u0026rsquo;neglect\u0026rsquo; part is particularly alarming. With the world leaning towards automation, dependency on machines has increased. 
Ideas like voice-activated control of fans, lights, and televisions are now common in households. What once belonged in science fiction movies is now reality, all powered by billions of lines of code. Therefore, if we, the software engineering community, don\u0026rsquo;t prioritize code quality, the repercussions could be severe. Faulty code has caused loss of life due to machine failures, automobile hacking, privacy breaches, and more.\nSoon, the government may regulate what code is used in machines, given the stakes involved in public safety. The problem lies in the fact that policymakers often lack sufficient knowledge about software. These individuals are elected based on their experience in public policies, not their technological expertise. Unfortunately, the code we write today deeply impacts daily life, merging into a single operational concept. In such cases, these elected officials may seek expertise from external parties. Outsourcing decision-making about software can open a Pandora\u0026rsquo;s box of problems. The law of diminishing returns suggests that adding a flawed system to fix another can worsen the situation. Granting an organization the power to label software as \u0026lsquo;good\u0026rsquo; or \u0026lsquo;bad\u0026rsquo; introduces hierarchy, which undermines the egalitarian nature of the software industry. For instance, organizations publishing \u0026lsquo;best practices\u0026rsquo; in books, blogs, and whitepapers tend to offer suggestions rather than directives.\nEarlier this year, I believed software quality was measured solely by the number of test cases or code coverage. However, it\u0026rsquo;s also about how software performs during outages and maintaining test suites. Concepts like chaos testing are gaining traction, with defined roles in scalability, load, performance, and configuration testing for SDET positions. 
Writing good code is crucial, especially if government intervention dictates not just coding practices but also allows backdoor access for governments and organizations. Edward Snowden has discussed the dangers of over-surveillance if an organization decides to track every user. I also believe that success stories, like those emerging from Harvard dorms to become billion-dollar companies, may become rare. Ironically, even a democratic system can stifle the democratization of software in unimaginable ways.\nIt\u0026rsquo;s time for software engineers worldwide to commit to writing good code. It\u0026rsquo;s not just about avoiding poor performance, but also about facing the potential for major catastrophes that could bring sweeping regulations.\n","date":"February 15, 2023","externalUrl":null,"permalink":"/posts/the-00-estate/","section":"Posts","summary":"In the House of Commons of Great Britain, Edmund Burke once stated, “There were Three Estates in Parliament; but, in the Reporters’ Gallery yonder, there sat a Fourth Estate more important than them all.” As the 21st century dawned, we observed media houses displaying biases towards their respective lobbies. The ’truth,’ which should be absolute, now has shades of well-crafted absolution. “Whose truth is the truth?” is a question I often ask myself while reading news articles on the internet. With the government in power controlling the first three estates, it’s not an exaggeration to say that Orwell’s “1984” no longer seems like fiction. This blog explores the hypothetical “00 Estate” that encompasses all estates and dictates the terms for writing code.\n","title":"The 00 Estate: What Happens When the Government Dictates How to Write Code?","type":"posts"},{"content":"A trainee undergoing military training can disassemble and assemble a machine gun within a minute. At first, this might seem very complex, but everyone in the academy manages to do it. 
The more pertinent question is not \u0026lsquo;how\u0026rsquo; but \u0026lsquo;why\u0026rsquo;. It\u0026rsquo;s because their lives depend on it. Similarly, a disciplined programmer learns the vocabulary and syntax of a programming language with utmost sincerity. Every word in the programming language is sacred, and any non-conformance is akin to blasphemy.\nPerhaps the mind sees what it chooses to see, and I am trying to draw parallels deliberately, even if there aren\u0026rsquo;t any. However, I do see a lot of similarities between parenting in an Eastern cultural environment and the art of disciplined development. Interestingly, the phases of learning and the four stages of Hindu life share commonalities.\nBrahmacharya (student life): In this phase, you need to understand the vocabulary of the language and practice. Practice as much as possible to ensure that everything about the programming language is permanently ingrained in your brain. With extensive practice, you start recognizing patterns that the untrained eye cannot see. In David Epstein\u0026rsquo;s book \u0026lsquo;Range\u0026rsquo;, there\u0026rsquo;s an anecdote about a well-known grandmaster, possibly Judit Polgár, who can recreate a chess game scene by looking at the board for a few seconds. However, if the pieces are arranged in a way that results in an impossible position or doesn\u0026rsquo;t conform to the laws of chess, then the grandmaster struggles to recreate the same position. Conclusion: the more you practice, the more patterns you can subconsciously recognize.\nGrihastha (household life): Once you know the basic vocabulary of the programming language and have experimented with a few projects, it\u0026rsquo;s time to seek a community. The best places for this are \u0026lsquo;GitHub\u0026rsquo;, hackathons, and meetups. If your area, college, or city lacks a programming \u0026lsquo;family\u0026rsquo;, create one. 
This phase is essential as you gain a sense of responsibility for the code you author. Engaging with major open-source projects and resolving issues gives a sense of belonging to a community.\nVanaprastha (retired life): Though \u0026lsquo;retirement\u0026rsquo; might be misleading here, I\u0026rsquo;ll try to fit the idea. This phase is for introspection. You ponder why one paradigm is better than another, or why a specific technology is well-suited for a particular problem. These existential questions are best answered in solitude. If these questions are asked too early in a developer\u0026rsquo;s journey, it leads to confusion. Every programming language has its strengths and weaknesses. Even a language like PHP, often criticized, has significant advantages. Diving into these questions before fully understanding the language can prevent a developer from appreciating the nuances of their chosen language.\nSannyasa (renounced life): Only when you fully understand the constructs of a language are you ready to exploit its paradigms to suit your needs. In the final phase, a disciplined programmer understands the discipline so well that they can experiment with established rules for deeper comprehension and enlighten others in the same field.\nUntil then, follow what is prescribed in the textbooks. You don\u0026rsquo;t have the luxury of iterating over a collection in your own way; you must perform actions as the textbook suggests. Yes, your learning curve may seem to plateau, but you will start to appreciate the science of learning.\n","date":"January 15, 2023","externalUrl":null,"permalink":"/posts/disciplined-paradigms/","section":"Posts","summary":"A trainee undergoing military training can disassemble and assemble a machine gun within a minute. At first, this might seem very complex, but everyone in the academy manages to do it. The more pertinent question is not ‘how’ but ‘why’. It’s because their lives depend on it. 
Similarly, a disciplined programmer learns the vocabulary and syntax of a programming language with utmost sincerity. Every word in the programming language is sacred, and any non-conformance is akin to blasphemy.\n","title":"STR: A Disciplined Programmer","type":"posts"},{"content":"An extraordinary year comes to an end, just like many others, yet I distinctly remember that one year when a remarkable entourage of human beings gathered around me. I had hoped our story would have a happy ending, but sometimes wishes don\u0026rsquo;t come true. Now, staring at the glass wall, I see only myself, alone, burdened with a heavy bag of despair and endless remorse. The seeds I had sown were now bearing their bitter fruit. But my story wasn\u0026rsquo;t always like this.\nThere was a time when I saw her in my reflection. Her enchanting beauty, meticulously refracted through her glasses, captivated me. No matter how small my angle of incidence, the image I received was larger than life. Deep down, I never anticipated that my angle would diminish to a critical point, plunging me into eternal darkness. My story began with a plasma wall, commonly known as a monitor or personal computer, where I first learned about this incredible person. I don\u0026rsquo;t know when we became close, but somehow, streams of binary bits connected us. Those simple sequences of zeros and ones profoundly impacted my life, and they still do.\nAt times, I find myself staring at that same plasma wall, reading past conversations. The color has faded; there\u0026rsquo;s no color at all. It makes no sense to revisit a past that was once colorful and radiant, yet I do, reading some paragraphs so many times that I no longer need a screen to see them. They come naturally.\nIn the beginning, I chased many dreams. Some came true, but most failed. When dreams fail, the glass wall in which I see myself sometimes shatters, its fragments piercing me, leaving me asphyxiated and bleeding. 
Then came a day when I lost everything and found myself drowning in an ocean of melancholy, unable to scream or signal for help. But I found comfort in a surprising place.\nEvery nightmare ends, and so did mine. I felt the gentle touch of someone on my forehead, a reminder that the world hadn\u0026rsquo;t ended. Everyone is haunted by their past, but some carry haunted stories. Reaching their zenith, they see their problems as minuscule and keep moving forward, fueled by each challenge.\nI came to know her. Slowly, I cleared the mist from my glass wall and saw her as my reflection. I had found the love of my life. Love ballads and romantic literature suddenly made sense. I began to dream again and regained my life. It seemed like a stroke of luck, as if a talisman had been placed under my pillow.\nBut one day, my reflection started to fade. I watched her leave, unable to stop the widening cracks. She tried to mend them but only got hurt in the process. I never wanted that for her. I wanted the best for her, but I ultimately failed. The crack remains, my reflection now distorted. Yet, I felt grateful to have been loved by her. I left behind a silent tear and a hopeful wish, knowing it was impossible for us to converge again in this space and time. The winter chill froze me inside, deepening my coldness.\nYears later, I see her again. I had always hoped for a meaningful beginning and a strong ending to our story. I searched for my glass wall, finding only distorted fragments in my attic. Now, I see the love of my life. She\u0026rsquo;s happier – married, successful, with Ivy League degrees and beautiful children. I long to cross the street, talk to her, and express my undying love. But I don\u0026rsquo;t, or rather, I can\u0026rsquo;t. Not out of fear of societal judgment, but because I want to preserve the dream she once had. I wish to do something as simple as holding her hand and promising never to let go. 
But I still don\u0026rsquo;t, or rather, I can\u0026rsquo;t. Long ago, I left this world\u0026hellip;\n","date":"December 31, 2022","externalUrl":null,"permalink":"/posts/the-glass-wall/","section":"Posts","summary":"An extraordinary year comes to an end, just like many others, yet I distinctly remember that one year when a remarkable entourage of human beings gathered around me. I had hoped our story would have a happy ending, but sometimes wishes don’t come true. Now, staring at the glass wall, I see only myself, alone, burdened with a heavy bag of despair and endless remorse. The seeds I had sown were now bearing their bitter fruit. But my story wasn’t always like this.\n","title":"The Glass Wall","type":"posts"},{"content":"","externalUrl":null,"permalink":"/authors/","section":"Authors","summary":"","title":"Authors","type":"authors"},{"content":"","externalUrl":null,"permalink":"/categories/","section":"Categories","summary":"","title":"Categories","type":"categories"},{"content":"","externalUrl":null,"permalink":"/series/","section":"Series","summary":"","title":"Series","type":"series"}]