T-SQL Tuesday #133: Learn by Presenting

I talk way too fast, I want audience participation, and I always make stupid typos.

Well folks, it’s been a minute. How’s your pandemic going? Hopefully safe and healthy.

This month’s blog party is brought to you by Lisa GB, and the topic is “What (else) have you learned by Presenting?” — whether that’s a talk at a big conference, a SQL Saturday or meetup, or (like me, because I’m still working my way up to it) just a session for your own little group of coworkers.

I don’t have that much presentation experience under my belt yet. After COVID started, I briefly tried my hand at Twitch streaming, only to walk away dejected and disillusioned because nobody watched. (Probably blame the content as much as the presenter, but whatever.) I did give a few “big room” talks at work last year, which went pretty well, and I got some great feedback, including helpful constructive criticism — talk slower, keep things on-topic and relevant to the work we’re doing, etc. All good stuff.

Lately, I’ve been doing brief “TSQL Tuesday” video-meetings with very small groups of colleagues (usually only 2-5 people show up any given week). We talk about whatever database-related points of interest have come up during our day-to-day grind. Sometimes it’s completely spur-of-the-moment. At least a few times, one of my stellar new BAs (also our QA/tester) brought something to the table that even *I* wasn’t familiar with, like the cool IIF function in T-SQL, or the EXCEPT/UNION/EXCEPT method for comparing homogeneous data-sets, and the related concept of “bucket chemistry” when doing preliminary analysis on large buckets of data. (Sounds like a blog post that someone should write.)
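(For anyone who hasn’t met either of those, here’s a quick taste — table names invented for the example, and this is just a sketch of the idea, not anyone’s production code.)

    -- IIF: shorthand for a simple CASE expression
    SELECT  OrderID
          , IIF(TotalDue >= 1000, 'big', 'small') AS OrderSize
    FROM    dbo.Orders;

    -- EXCEPT/UNION/EXCEPT: rows that exist in one set but not the other, checked in both directions
    (   SELECT OrderID, TotalDue FROM dbo.Orders_Staging
        EXCEPT
        SELECT OrderID, TotalDue FROM dbo.Orders    )
    UNION ALL
    (   SELECT OrderID, TotalDue FROM dbo.Orders
        EXCEPT
        SELECT OrderID, TotalDue FROM dbo.Orders_Staging    );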

Here are the things I’ve learned. Well, besides those mentioned above.

A. I really do talk way too fast when I’m presenting. Still. Trying to work on it.
B. I enjoy audience interaction much more than monologue-ing.
C. Never type in demos.

And finally, I’ll leave this as a teaser for a future post. While working on some demo material for “#TempTables – Use or Abuse?”, I got super confused about temp-table scope in the context of stored-procs that abort or fail. For a moment I understood why developers insisted on putting DROP TABLE #MyTempTable at the end of every one of their procs — as a safeguard, with seemingly no downside. Anyway, as I read into things and did a bit of research, I found this gem about how temp-tables are cached, and realized that I’d been committing a particular sin (SELECT INTO #temp, followed by ALTER TABLE to add columns for fetching more data) for a long time without realizing its risks.
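(To show the pattern I mean — throwaway names, and a sketch rather than gospel. The “risky” version does DDL on the temp-table after creating it, which, as I understand it, is one of the things that can prevent temp-table caching; the alternative declares the full shape up front.)

    -- the risky pattern: SELECT INTO, then widen the table to fetch more data
    SELECT  CustomerID, OrderTotal
    INTO    #Work
    FROM    dbo.Orders;

    ALTER TABLE #Work ADD CustomerName nvarchar(100) NULL;   -- DDL after creation

    UPDATE  w
    SET     CustomerName = c.CustomerName
    FROM    #Work AS w
    JOIN    dbo.Customers AS c ON c.CustomerID = w.CustomerID;

    -- the alternative: declare the full shape up front, then populate it
    CREATE TABLE #Work2
    (
        CustomerID   int            NOT NULL
      , OrderTotal   money          NULL
      , CustomerName nvarchar(100)  NULL
    );

    INSERT INTO #Work2 (CustomerID, OrderTotal)
    SELECT  CustomerID, OrderTotal
    FROM    dbo.Orders;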

Stay safe and healthy friends!

PS: there’s been a bunch of chatter on Twitter about PASS, people resigning left and right, and I’ve no idea what’s going on, but I’ll get the popcorn, cuz it’s sure to get entertaining. =D

gif of michael jackson eating popcorn

TSQL Tue..Wednesday: Analogy Time!

Tuwednesday. Blursday. It’s all the same day now, isn’t it? Really this post is just about showing off some incredibly spot-on analogies by fellow #sqlfamily bloggers.

And of course you start with the creme-de-la-creme, Mr. Ozar. We’re having our dessert first, you see. Life’s short, live a little, etc. The one-liner: the database is a walk-in fridge in the back of a restaurant; the DBA is the refrigerator technician; the developer is a chef that works in the restaurant to which that walk-in fridge belongs.

Speaking of food, Deb Melkin describes how execution-plans (query plans) are like a routine for going out and running your weekly errands. And how small changes can sometimes lead to incredibly large differences in the execution of said plans.

Jeff Ianucci has another restaurant-related metaphor. Boy, us DBAs sure are a hungry bunch, eh? And thirsty! Shane O’Neill regales an imaginary youngster in his native dialect about how databases are like whiskey cabinets. It’s true, sometimes the job does drive you to drink. Especially when a developer leaves the door to the walk-in-fridge open.

So what do I tell people when they ask me “What.. what would you say, you DO here?” Well. Let’s say the company is like an animal, right? Any kind of mammal, let’s say. Leopard, Zebra, Elephant, Platypus. Whatever you fancy. The organism’s body functions primarily because of the blood in its circulatory system. Blood delivers oxygen, transmits nutrients, and moves throughout the entire body. So, too, does DATA. It delivers valuable information, transmits decision-making-fuel, and moves throughout the organization. Without data, the company cannot function. It cannot know how many customers are coming in or leaving, how successful a marketing campaign has been, whether a product is selling, or when the next anticipated failure of a WidgetMaker9000 machine might be so they can bring in a service-tech to inspect it.

All of these “measures” are things that the company heads care about. And none of them are knowable without the data. Sure, you can wave your hand around and produce some gut-feel “SWAGs” (sophisticated wild-ass guesses), and if you’re lucky, you’ll only be wrong half the time… maybe 1/3rd if you’ve got some experience behind you. But data has the power to actually tell you those answers. All you have to do is listen.

It’s here that the analogy breaks down a bit, because we assume that, while you as an organism have very little power over where and how your bloodstream is organized & flows, you as a data professional (and as a technology-invested company, on the whole) have PLENTY of that power over your data. You need to invest in the right infrastructure, the right people, the right technology, and the right models, for your business. And much like the searing pain of heartbreak, if you discover that so much of that data-circulatory-system is wrong that it’s killing your organism (organization), “rip it out, and start again.”

Oi. Now I sound like I’m comparing DBAs to gods. Don’t worry, I’ve had enough humble-pie lately to dash that illusion to smithereens, thank you. Now go enjoy some good food and whiskey. ❤

T-SQL Tuesday #122: Impostors!

Be humble, but be confident in your own expertise. Realize that you will sometimes be the teacher, and sometimes the student.

Howdy folks! It’s been a while. My last post was in October of last decade (har har)! I’ve been busy with other projects, work, and life in general. 🙂

Now, this month’s invitation is hosted by John Shaulis. I’m always excited to see someone in the #SQLcommunity that I’ve never heard of before. Looks like he’s familiar with the same WordPress templates as I am. Heh. I like it.

Side-note, I’ve been planning to buy my own domain and remove the ads here for a better reading experience (and more professional feel), so hang in there… it’s happening soon!

And now without further ado.

Who, Me?

Everybody reading this knows what impostor syndrome is. I won’t bore you with that. Fact is, if you’re in tech, you either have it or you know someone who does. If neither, well you’re probably a narcissist surrounded by other narcissists and you might wanna look for another job. Heh heh.

oh, get a job? just get a job? why don't I strap on my job helmet, get into a job cannon, and fire off into job land, where jobs grow on jobbies!
I think this is from “It’s Always Sunny in Philadelphia” but I’ve never seen the show.

Yes, You!

I’ll share a personal and ongoing example of where I’m constantly feeling “not good enough at my job”. It’s about server migrations. You know, where you take a running production SQL Server instance, and you have to move it to new hardware/infrastructure. Maybe it’s a physical (bare metal) cluster to a virtualized environment. Maybe it’s just one VM to another VM on a newer platform. It typically involves taking a maintenance window and convincing business stakeholders that the “downtime” is worthwhile.

Now surely we’ve all heard of DBATools by now, right? (If not, go there right now and check ’em out!) They were BUILT to do migrations. Yet I’ve not been able to successfully use them as the whole and ONLY toolset for that purpose. Let me clarify: I’ve used pieces of the framework (such as the copy logins / users command, and others) to HELP me in the migration project, but I’ve never been able to simply fire off Start-DbaMigration, go get coffee, and watch as things magically fall into place.

dog in a room on fire, casually drinking coffee, saying "this is fine."
It’s working EXACTLY AS IT SHOULD!

Couldn’t Be!

Even when I’ve built a beautiful check-list, lined up all my scripts, scheduled things via Agent Jobs, double-checked names and permissions and networks and drive letters and shares. Still, something inevitably goes wrong. Maybe it’s just that a database refuses to go into READ_ONLY mode when I ask it to. Maybe it’s that a whole series of jobs get lost because they were in the wrong category. Maybe user permissions don’t come over on this database because I didn’t sacrifice a chicken on the night of the blood-moon while the wind blew north-east.

Surely, by this point in my 10+ years as a database professional, 4 years at this particular company and role, I would be able to do this with my eyes closed. With no hiccups or misfires. Right? RIGHT??

murphy's law: if anything can go wrong, it will go wrong.
Full plaque samples, which are quite entertaining, can be seen here & here, to name a few.

Then Who?

Wrong. Murphy’s Law is a real thing. More than that, things CHANGE. The environment is always evolving to fit the business needs. The technology is always just a little ahead of my own knowledge-base. And every SQL instance in this environment is, unfortunately, still, a ‘pet‘. Not a ‘cattle‘ (cow?). They’re each carefully and fearfully constructed to suit a specific set of application and business needs, each with their own nuances and barely-documented dependencies.

Plus, it’s not like we’re doing these migrations very often. Anything you don’t do on a constant basis, at least a few times per week, tends to sink back toward the bottom of the mental stack, and you forget the details and gotchas that were involved because your brain needs to keep more important and relevant things near the top, for what you’re working on “right now”. This is normal, at least in my mind.

DevOps stole the cookie from the cookie jar!

Just hire a DevOps Engineer to solve all your problems. Then everything will be all sunshine and rainbows and unicorns. All your servers will be cattle, replaceable little containers that are allowed to fail because there’s always another one waiting to spin-up and replace it. Not like your DATA is important or anything. It’s only the lifeblood of the entire company.

Not really sure where I was going with this. I love DevOps, don’t get me wrong, and it does have a part to play in making the traditional RDBMS platform more agile, but the devilish details and difficulties therein are ALWAYS overlooked/brushed-off in popular discourse, and it’s got me a little jaded.

I mostly wanted to finish the song I’d started with the title-blocks. =P

N.

But srsly. Instead of feeling guilty about your impostor syndrome, just own it. Be humble, but be confident in your own expertise. Realize that you will sometimes be the teacher, and sometimes the student. And hopefully more often the latter, because there’s exponentially more to learn every year we evolve on this crazy spinning rock we call Earth.

live as if you were to die tomorrow. learn as if you were to live forever. -Mahatma Gandhi
Good advice in general.

T-SQL Tuesday #119: Change of Mind

Get up, come on get down with the SPACES!

Because I’m super late (as usual), about to go to bed (because I’ll be commuting tomorrow), and lazy (aren’t we all?), this will be a quickie.

This month’s #tsqltuesday is hosted by the amazing Alex Yates, who asks us to discuss something about which we’ve changed our minds over the course of our career (or some subset of time therein).

I’m really excited to read some of the submissions I’ve skimmed on the Twitter feed so far, such as Oracle vs. MS-SQL and the importance of diversity. After all, what else am I gonna do while I sit in the vanpool for 3 hours? (round-trip, thankfully, not one-way!)

Tabs vs. Spaces

Oh my! Them’s fightin’ words. Even back in the early days of this very blog, I wrote about my preference for tabs. But now.. *gasp*.. I’m down with the spaces!

blasphemy-300
Blasphemy of the highest order!

Why, you ask?

Well, partially because I’ve changed some of my overall T-SQL coding style preferences and methods of construction. When I learned the Alt-Shift select method (block selection, aka vertical selection) in SSMS, it definitely set me on a track away from tabs. Now I don’t go all cray-cray with vertically aligned sections/clauses/etc. too much, but I will say that in certain instances, it’s made the query much easier to read. And in such instances, spaces definitely trump tabs for ease-of-use with said vertical-alignment efforts.

If you’re not sure what I’m talking about (because, admittedly, it’s hard to write about and much easier to show visually), just search Youtube for an example of SSMS block-select tricks.
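(Roughly this sort of thing; the names are made up, but the point is that with spaces, the alignment survives no matter what anyone’s tab-width is set to.)

    SELECT  ord.OrderID
          , ord.OrderDate
          , cst.CustomerName
          , SUM(lin.Quantity * lin.UnitPrice)   AS OrderTotal
    FROM    dbo.Orders      AS ord
    JOIN    dbo.Customers   AS cst  ON cst.CustomerID = ord.CustomerID
    JOIN    dbo.OrderLines  AS lin  ON lin.OrderID    = ord.OrderID
    GROUP BY ord.OrderID, ord.OrderDate, cst.CustomerName;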

And this is within the last 4 years, so I still find old stored-procs that I’ve written that have the tabs, and I chuckle slightly to myself as I Ctrl-K-Y (that’s the Red Gate SQLPrompt shortcut to ‘format this code in my current style’) and make my modifications.

Miscellaneous Little Things

I’ve developed some other preferences, too, which contradict some of my old formative-years’ habits. For example, I used to write my TSQL in pure lowercase. I now prefer the ANSI-CAPS for language constructs and keywords, but if I ever need to write dynamic-SQL, it goes in lowercase.

caps-lock-not-always-necessary
I even re-used old images, instead of finding new ones. LAAAZZZYY!! :O)

Some of these habits come from Aaron Bertrand and other SQL-community big-name bloggers. Like preferring CONVERT over CAST, or changing from trailing-commas to leading-commas. (Although he may have flipped on that again, I can’t remember.) While others just kinda happened organically. Like, two tabs for the ON line under each JOIN — the join predicate — just felt silly after a while, so I reverted to one. I used to be a stickler for forcibly quoting identifiers that collided with language keywords — like if you have a column named Date or Value, you best be puttin them square-brackets around those suckers ([Date], [Value]), but now.. honestly, I don’t care enough to bother. Unless you do something really heinous, like timestamp. =P
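(Put together, the current style ends up looking something like this — table and column names invented purely for the example.)

    SELECT  rdg.DeviceID
          , rdg.[timestamp]                                   -- heinous name, so it gets the brackets
          , CONVERT(date, rdg.ReadingTime)  AS ReadingDate    -- CONVERT over CAST
    FROM    dbo.DeviceReadings AS rdg
    JOIN    dbo.Devices        AS dvc
        ON  dvc.DeviceID = rdg.DeviceID                       -- one indent for the ON line, not two
    WHERE   rdg.ReadingTime >= '2019-01-01';

    -- dynamic SQL stays lowercase
    DECLARE @sql nvarchar(max) = N'select count(*) from dbo.DeviceReadings where DeviceID = @id;';
    EXEC sys.sp_executesql @sql, N'@id int', @id = 42;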

Anyway, that’s all I have for now. There are much more important things that I could, and should have, written about, but as I said, and as always, I’m already late to the party. ‘Til next time! ❤

T-SQL Tuesday 118: Fantasy Feature

It really shouldn’t be this difficult. But it is. And that’s why we get paid. Still, it’d be nice if load-testing were easier, wouldn’t it?

Aka “hey look at us Microsoft, we want stuff!!” Because they shut down Connect and its replacement (Azure Feedback aka rebranded UserVoice) is awful. Just plain terrible. In unrelated news, I’ve never been an MVP.. wonder why? 🙄😜

Anyhoo, this month’s invitation is brought to us by the lovely and talented Kevin Chant. He asks us to fantasize about SQL Server. No, not like that Erik, get your mind out of the gutter.

And I don’t apologize. At all.

This IS a complaint. But hopefully it’s also an idea for those who are better at building stuff than me to.. ya know.. build stuff.

And shout-out to my favorite fellow blogger Shane (@SOZDBA) who’s too polite for his own good. ❤

Load Testing is HARD.

Too hard. So hard that nobody does it. At least not productively, efficiently, or willingly. About the only time I can personally point to, where I’ve actually buckled down and done something roughly comparable to a true load test (which I’ll talk about in a minute), was when I was forced to do so by my managers, to prove that a hardware environment upgrade hadn’t gone awry and that our servers truly were running at least as well as, if not better than, before.

But guess what? We never really proved anything conclusively. We had inklings, feelings, warm fuzzies.. okay maybe a DiskSpd output file or two.. which indicated that things were “mostly probably pretty good and kinda sorta better.”

Why? BECAUSE IT’S FREAKING HARD.

What is “True Load Testing”?

a graphic representing software load testing
Something like that… maybe?

Glad you asked. Simply put, it’s the ability to execute these 3 steps, easily and efficiently, with minimal configuration overhead and without needing to pore agonizingly over tomes of docs:

  1. Capture and store a SQL Server workload — i.e. ALL the transactions run against an instance in a given time frame — AND performance metrics from said instance while said workload was running.
  2. Run that captured workload against another SQL Server instance, and capture THE SAME performance metrics.
  3. Compare results from each set of gathered performance metrics, with a concise, easy to understand rating system that tells you which instance ran better and why.

Now, could we argue that the “capturing” of #1 adds some overhead to the instance? Sure! So I’m fine with NOT gathering the performance metrics during the same window as the workload-capture. Put it off to step 2, where we replay said workload against 1 or more instances and measure the performance on them. So we could replay the same workload on the original instance, and a new one, and we’d have our two sets of measurements to compare.
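(To be fair, step 1 on its own isn’t impossible. An Extended Events session along these lines will happily capture batches and RPC calls to a file — path and session name are just examples. It’s everything after that, the replay and the apples-to-apples comparison, where the wheels come off.)

    CREATE EVENT SESSION WorkloadCapture ON SERVER
    ADD EVENT sqlserver.rpc_completed
        (ACTION (sqlserver.sql_text, sqlserver.database_name)),
    ADD EVENT sqlserver.sql_batch_completed
        (ACTION (sqlserver.sql_text, sqlserver.database_name))
    ADD TARGET package0.event_file
        (SET filename = N'D:\Traces\WorkloadCapture.xel', max_file_size = (512))
    WITH (MAX_DISPATCH_LATENCY = 5 SECONDS);

    ALTER EVENT SESSION WorkloadCapture ON SERVER STATE = START;
    -- ...let the workload run, then STATE = STOP and read the .xel files back with sys.fn_xe_file_target_read_file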

Got me? Good.

The Plumbing is There

I know. I know what you’re screaming at your monitor/screen right now. “But Nate, that’s exactly what Distributed Replay is for!!”

Bruh, have you even USED Distributed Replay?!? It’s way too complicated to set up, let alone manage and operate. Remember what I said about tomes of docs? Yeah. ANGTFT.

That’s the sad part. Microsoft has built up the plumbing and scaffolding for all of this over the past few decades. But we’ve yet to see that final layer, that chrome polish and finishing touch that makes the user go “Ahhh, now THAT was an educational and enjoyable experience!”

red easy button
Staples’ trademark be damned!

Obviously when it comes to performance metrics we’ve got a huge wealth of knowledge in the system DMVs. Great! Now let’s condense and simplify those into like 4 key ratings of your instance, for those of us who aren’t Paul Randal or Glenn Berry.
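(Even a quick-n-dirty pass over the wait-stats DMV gets you partway to a “where does this instance hurt?” rating. The benign-wait filter below is trimmed way down just to keep the example short; the real list is much longer.)

    SELECT TOP (10)
            wait_type
          , waiting_tasks_count
          , wait_time_ms        / 1000.0  AS wait_time_sec
          , signal_wait_time_ms / 1000.0  AS signal_wait_sec
    FROM    sys.dm_os_wait_stats
    WHERE   wait_type NOT IN ( N'SLEEP_TASK', N'LAZYWRITER_SLEEP', N'XE_TIMER_EVENT'
                             , N'BROKER_TO_FLUSH', N'SQLTRACE_INCREMENTAL_FLUSH_SLEEP' )
    ORDER BY wait_time_ms DESC;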

Oh, 3rd party monitoring products you say? Sure! Great! Love em. Do they do what I just said? Nope. Because “it depends.” Anybody else sick of hearing that?

It Really Shouldn’t Be This Difficult

But that’s why we get paid. Because it is. And no matter how many cloud services Azure & AWS try to shove down our throats, the reality is that enterprises will continue to rely on human engineers to prove (or disprove) that NewFancyServerX is better than OldCrappyServerA for running YourTerribleSqlWorkloadZ.

Because we can’t architect perfection. And we live in the real world where business decisions and financial constraints have an actual measurable impact on our technology stack choices and roadmaps. So I’m not saying it’s inexcusable that we don’t have this — this easy, measurable, understandable toolset for performance-load-testing — yet. I’m just saying it’s mildly annoying. And perhaps a little frustrating.

With that, I think I’ve written two angry rant-y posts in a row, so I do apologize to you, dear reader (but not, and never, to Microsoft). I’ll leave you with this cute picture of my dog being ridiculous, because it always makes me smile. Til next time!

husky dog laying on back in silly position
doggo being doggo

Follow-up: Cribbage “15’s Counter”

The actual method involves joining 5 copies of the table together, by each right-side table only including cards with higher ID values than the table to its left.

To be honest, my T-SQL Tuesday puzzle was a bit of a last-minute idea, which is why I didn’t have a solution ready-made. But, dear reader, you’re in luck! I have one now.

The code is over here in Gist. You can read thru it, but since the final query — the actual “answer” — is kinda ugly, let me explain my thought process.

Modeling is Important

Even when I’m putting together a silly little demo script like this, I feel that good habits and fundamentals are important. You never know what future developer might read it, copy-paste it, and say to themselves “Cool, I’m gonna follow this example when I do this other thing over here!” So you’ll see my formatting preferences, naming convention (though I must admit, I argued with myself over whether to pluralize the table names or not!), and correctly allocated Primary Keys. And since we’re modeling a card deck, even though I didn’t need to store the ‘NumValue’ (which is what you’d use for a straight/run, where the Jack is 11, Queen is 12, etc.), I did anyway.

Now, when we set up our “Hands”, we’re going to use two ‘PlayerNum’s, just so we can test two different hands at the same time. Cribbage can be played with 3 or 4 players, but we’re keeping this simple. Also, I could have built the hands more aesthetically, i.e. by selecting from Cards using PtValue and Suit, but again, I was trying to script quickly, so I just used the IDs that I knew from the previous query (the “full deck”). And again, there’s a “little extra” tidbit, the ‘IsCut’ indicator — we won’t be using that right now. If you’re still not sure what that means, go read the rules.
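(If you’d rather not click through to the Gist just yet, the shape of the model is roughly this — the Gist has the real thing, this is just a sketch of it.)

    CREATE TABLE dbo.Suits
    (
        SuitID   tinyint     NOT NULL PRIMARY KEY
      , SuitName varchar(10) NOT NULL              -- heart, spade, diamond, club
    );

    CREATE TABLE dbo.Cards
    (
        CardID   tinyint    NOT NULL PRIMARY KEY
      , SuitID   tinyint    NOT NULL REFERENCES dbo.Suits (SuitID)
      , Display  varchar(2) NOT NULL               -- 'A', '2'..'10', 'J', 'Q', 'K'
      , NumValue tinyint    NOT NULL               -- 1 thru 13, for runs
      , PtValue  tinyint    NOT NULL               -- 1 thru 10, for counting 15s
    );

    CREATE TABLE dbo.Hands
    (
        PlayerNum tinyint NOT NULL
      , CardID    tinyint NOT NULL REFERENCES dbo.Cards (CardID)
      , IsCut     bit     NOT NULL DEFAULT (0)
      , PRIMARY KEY (PlayerNum, CardID)
    );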

The Method

At the end of the original post, I mentioned loops and cursors as possible routes to a solution. That may still be true, but I decided to challenge myself to avoid them. Not because they’re “always bad”, as popular media would have you believe; they’re just often an indicator that a developer isn’t thinking in set-theory when they probably should be.

Let’s start with some basic principles. You have 5 cards in your hand. It takes a minimum of two cards to make 15 (examples include Jack+5, 6+9, etc.), and up to a maximum of.. you guessed it, five cards. So we need to check all combinations of any two, three, four, or five cards. We cannot re-use a card within the same combination; and putting the same three cards in a different order, for example, does NOT count as a separate combo (another ’15’).

So as you start to think about these rules, and if you’ve been around data for a while, especially data with identity values, you might have a little light-bulb. “Aha! I know how to do that. We can simply order the combos by the ID value, and that way we won’t allow duplicates!” And that’s kinda what I did, by enforcing the JOIN predicates that every subsequent derived-table have a ‘CardID’ greater than the prior one. But I’m getting ahead of myself.

The actual method here involves JOINing 5 copies of the table together, mainly just on PlayerNum, but also, as I said, by each right-side table only including cards with higher ID values than the left-side. In this way, we ensure that we’re not allowing the same cards to be “joined” to each other, i.e. we’re removing them from the right-side tables.

And finally, we have four OR‘d conditions: simply “do any of those combinations add up to 15, by the Card’s PtValue?” These are echoed in the CASE-expression in the SELECT line, where we want to essentially “show the combo”, i.e. tell you what cards make up the ’15’. (Again, for style’s sake, we have an ELSE, but we don’t really need it because it’ll never actually happen.)
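(Here’s a rough reconstruction of the shape of that query — not the Gist verbatim; I’ve pre-joined Hands to Cards in a CTE just to keep the join list readable, and the DISTINCT mops up the duplicate rows that the extra LEFT JOINs would otherwise produce.)

    WITH hand AS
    (
        SELECT  h.PlayerNum, h.CardID, c.PtValue
        FROM    dbo.Hands AS h
        JOIN    dbo.Cards AS c ON c.CardID = h.CardID
    )
    SELECT DISTINCT
            h1.PlayerNum
          , CASE WHEN h1.PtValue + h2.PtValue = 15
                     THEN CONCAT(h1.CardID, '+', h2.CardID)
                 WHEN h1.PtValue + h2.PtValue + h3.PtValue = 15
                     THEN CONCAT(h1.CardID, '+', h2.CardID, '+', h3.CardID)
                 WHEN h1.PtValue + h2.PtValue + h3.PtValue + h4.PtValue = 15
                     THEN CONCAT(h1.CardID, '+', h2.CardID, '+', h3.CardID, '+', h4.CardID)
                 WHEN h1.PtValue + h2.PtValue + h3.PtValue + h4.PtValue + h5.PtValue = 15
                     THEN CONCAT(h1.CardID, '+', h2.CardID, '+', h3.CardID, '+', h4.CardID, '+', h5.CardID)
                 ELSE 'n/a'   -- style only; the WHERE clause means we never land here
            END AS Fifteen
    FROM    hand AS h1
    LEFT JOIN hand AS h2 ON h2.PlayerNum = h1.PlayerNum AND h2.CardID > h1.CardID
    LEFT JOIN hand AS h3 ON h3.PlayerNum = h1.PlayerNum AND h3.CardID > h2.CardID
    LEFT JOIN hand AS h4 ON h4.PlayerNum = h1.PlayerNum AND h4.CardID > h3.CardID
    LEFT JOIN hand AS h5 ON h5.PlayerNum = h1.PlayerNum AND h5.CardID > h4.CardID
    WHERE   h1.PtValue + h2.PtValue = 15
       OR   h1.PtValue + h2.PtValue + h3.PtValue = 15
       OR   h1.PtValue + h2.PtValue + h3.PtValue + h4.PtValue = 15
       OR   h1.PtValue + h2.PtValue + h3.PtValue + h4.PtValue + h5.PtValue = 15;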

Now, it does look kinda ugly. It’s not very extensible — meaning, if you wanted to scale it up to find the ’15’s in a 6- or 7-card hand, or you wanted to look for other kinds of combos (like ’18’s or ’27’s), you’d end up re-writing a good portion of it, or at least copy-pasting a lot. Fortunately for us, Cribbage is fairly simple in this regard — your hand is always the same size, and you only ever care about ’15’s.

(Well, and pairs, 3- and 4-of-a-kinds, straights, flushes, knobs, etc., but again, read the rules if you’re curious. We kept this very simple by limiting ourselves to just one small fraction of the game mechanics.)

The cool thing about this sample, though, at least to me, is that you’re already set up to build on it if you want to try out other Cribbage mechanics. Or even other card games, if you just use the base Suits & Cards.

What Did We Learn?

What’s the point of a puzzle like this? Well, besides introducing you to a fantastic card game, if you didn’t already know about it. The point is to make your brain think in a different way than usual. Are any of us programming card games using a SQL back-end? Probably not. (Although an in-memory equivalent like SQLite or something might be viable!) But the next time you have a “combinations problem” with some real-world data, you might wonder if a method like this could come in handy. Or at least, if it could work out better than a double-nested-loop. =)

PS: I believe, instead of the LEFT JOINs, we could have used OUTER APPLYs. We’d move the conditions from the JOINs into the inner WHERE clause of each derived table, i.e. “this ID > previous ID” and “PlayerNums are equal”. If you’re curious, try it out!

T-SQL Tuesday #114: A Puzzle

One of the main things a new cribbage player needs to learn is how to easily spot the combos that make ‘a 15’ (the ways to combine cards to add up to a numeric value of 15). Let’s do that with SQL!

It’s that time again! The 2nd Tuesday of the month, T-SQL Tuesday. This month’s invitation is on the lighter side, which is nice, and it comes from Matthew McGiffen (b | t). The theme is “Puzzle Party!” And I’m going to cheat, since it’s getting horribly late already and I’m lacking in inspiration.

So, I propose a puzzle! Which you must solve using SQL. Then I’ll post my own solution in a day or two. Bwahahaha.

I actually really wanted to do a Sudoku solver, but @SQLRnnr beat me to it. By a few years. =P   I might still work on that when I’m bored, just to have a standby for another blog post. Maybe we’ll compare notes.

But for now…

Do You Even Cribbage, Bro?

If you’ve never heard of the card game cribbage, it might sound weird. When you read the rules, it sounds even weirder. Legend has it that it was invented by drunk Englishmen in a pub. Reality is actually not that far off. It’s also heavily played by Navy submariners, and that’s how it was passed down in my family.

There are already many great mobile & web versions of the game, and it will quickly become obvious to anyone who’s tried to program a card game before, that a query language like T-SQL is NOT suited (omg see what I did there?) to the task. However, we can probably come up with a small sub-task of the game that’s acceptable for our purposes.

Enter: the hand scorer. There’s a nice example of a finished product here. The input would be a set of 5 ‘cards’ — the ‘hand’ has 4, and the ‘cut’ adds 1 more, used as part of each player’s hand in scoring (like community property). A ‘card’ is simply an alphanumeric value — 1-10 plus JQK (which are ‘worth’ 10 for arithmetic, but can be used like normal for ‘straights’ aka ‘runs’) — and a ‘suit’ (heart, spade, diamond, club). Think for a moment on how you’d store that as a data structure.

The output, then, is a single numeric value, the ‘score’. But how do you score? You look for the following: combinations of any numeric values that add up to 15; pairs, 3-of-a-kinds, or 4-of-a-kinds; straights (suit does not matter); a flush, if all 4 ‘hand’ cards are the same suit (and a bonus point if the ‘cut’ card matches as well). And then there’s a funky thing where you get an extra point if you have a Jack that matches the suit of the ‘cut’ card. o_@

Dude… What?

Wow, that sounds complicated, no? Let’s make it simpler. One of the main things a new cribbage player needs to learn is how to easily spot the combos that make ‘a 15′ (the ways to combine cards to add up to a numeric value of 15). For each ’15’ you make, you score 2 points. That sounds pretty feasible in SQL, right?

For starters, we don’t really care about suit anymore. But we do need some way to distinguish the cards from each other. This is a single-deck game, so you’re never going to have more than 4 of the same number; never more than one of the same card (like the Ace of Spades). And when you’re counting combinations (or is it permutations?), you can’t use the same card twice. So let’s still use the suits for card distinction; I’ll just suffix the number with an ‘h’, ‘s’, ‘d’, or ‘c’.

We also don’t care about differentiating a 10 or J/Q/K, since they’re all just worth 10, numerically. So your ‘input’ can just consist of five numbers between 1 and 10. Cool? Just find the ’15’s!

Example:

  • Your hand is 3h, 6s, 6d, 9c, and the ‘cut’ is 3c.
  • Combos for ’15’: 6s+9c, 6d+9c, 3h+3c+9c, 3h+6s+6d, 3c+6s+6d.

That’s five unique combos, for a total of 10 points! Good job, that’s a bit better than an average hand. In cribbage lingo, you’d say it like so: “fifteen two, fifteen four, fifteen six, fifteen eight, and fifteen ten.” Or if you’re playing with more experience, you’d abbreviate to simply “two four six eight ten”.

In “normal” programming land, we’d probably use a loop and some branching logic. What will we do in SQL? A loop, a cursor, or something more (or less!) elegant? You decide!

I’ll come up with something solution-y soon. Update: Solution posted! Enjoy! ❤

cribbage board close-up of winning peg and partial hand
Red won by 2 points! Close game.

T-SQL Tuesday #113: Personal-Use Databases

So when I dived down the rabbit-hole of the Nested Set Model, of course I created a sample database to write & test the code against.

It’s that time again! This month, Todd Kleinhans (b/t) asks us how we use databases in our day to day life, i.e. personal use or “outside of our work / day-job”. Actually, the question is kinda vague — because if you think about it, we all use TONS of databases in our daily lives. Your phone’s contact list, your calendar, online shopping, banking.. the list goes on. As I’ve said before, Data is everything.

But what I think he meant, and the way most of the community has interpreted it, is “How do you manage/administrate/build/work-with/develop databases in your day-to-day life outside of work?”. So we’ll go with that.

Now this may out me as “not a real DBA” or some such nonsense, but honestly.. I don’t spend much of my time creating silly playground databases. Not that anybody else’s are ‘silly’ — just look at some of the fantastic posts for this month! Such neat ideas brought to life.

Special shout-out to Kenneth Fisher, who, if you look closely at his screenshot (and it’s not even related to this post), committed the abhorrent sin of creating a database name of pure emojis — FOR SHAME sir! But also you’re awesome. ❤

Me, I’m more of a quick-n-dirty spreadsheet guy. If I need, say, an inventory of my computer parts & gadgets so I know what I can & can’t repair, what materials I have to work with as I tinker, etc.. well, I create a Google Sheet. And it serves my needs well enough. (Spoiler alert: yes, you can view that one; I shared it. But it’s fairly outdated since I moved in March and haven’t had time to re-do inventory since.. last autumn.)

But for blogging in the tech field, you gotta get your hands dirty. So when I dived down the rabbit-hole of the Nested Set Model, of course I created a sample database to write & test the code against. And there have been some additional bits & pieces for blog demos and GitHub samples.

Most of the time, I’m creating databases / entities on SQL 2016 Developer Edition. Of course by now, that’s 2 major versions ‘behind’, but since I don’t run Linux personally (yet?), and I’m not a conference speaker (yet??), I don’t feel a burning need to upgrade. It’s FAR superior to Express Edition, though, so please for the love of all that is holy, if you find yourself using Express for personal/playground use, save yourself the headache and go grab Developer.

Containers/Docker? Meh. If you want to start playing with those, definitely look at 2017 or higher. It sounds appealing in theory — “just spin it up when you need it, spin it down when you don’t!” — and I’m sure that’s great if you’re starved for resources on whatever laptop you’re working with, but if you’ve done your due diligence and set your local SQL instance to appropriate resource limitations (hello, ‘max server memory’ and file-growths!), I’ve found that its impact is quite tolerable.
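(If you haven’t done that bit of due diligence yet, it’s a two-minute job — 4 GB below is just an example number, size it for your own machine.)

    EXEC sys.sp_configure N'show advanced options', 1;
    RECONFIGURE;
    EXEC sys.sp_configure N'max server memory (MB)', 4096;   -- leave plenty of RAM for everything else on the box
    RECONFIGURE;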

But come now. Surely this isn’t just a “shameless self-promotion” post or a grumpy-old-DBA “get off my lawn” post. Right?? Right!

To you folks out there creating your own nifty little databases for personal projects, learning/development, or even hopes & dreams of building a killer app on top of it one day — you’re amazing! Keep doing what you do, and write about it, because I love reading about it. Heck, go try to create the same model in PostgreSQL or MariaDB and see how it goes. We could all use a little cross-stack exposure once in a while.

That’s all I have for this month; short & sweet. I need to finalize plans for virtualizing our main SQL instances (which is really just a migration off bare-metal & onto VMs) within the coming weeks. Yes, we’re that far behind the curve. Now get off my lawn!

=P

clint eastwood frowning angrily
I’m old and racist! But I’m still adorable for some reason!

T-SQL Tuesday #112: Cookies!!

..this analogy of “dipping into the cookie jar”. What events or accomplishments can I take sustenance from, draw strength from, during these times?


Hi folks. It’s been a minute. Frankly it’s been a rough 5 months. On the career front, I’m faced with the challenge of virtualizing our core business-critical SQL instances with minimal downtime. And, because of personal/life stuff, it’s been difficult to stay focused and productive.

So this #tsql2sday‘s topic is poignant, I suppose — this analogy of “dipping into the cookie jar”. What events or accomplishments can I take sustenance from, draw strength from, during these times?

As usual, we must thank our host, Shane O’Neill (b|t), and remind readers to check out tsqltuesday.com.

Encouragement

Back when I first took this current job, I was worried sick about doing the commute every day. One and a half to two hours in traffic each way. Even if I used toll-roads, it might save 10-15 minutes, but it was still super stressful. But my family never stopped encouraging me, telling me it would pay off. And, when the time was right, I should have the telecommute/remote-work conversation with management.

And of course, to nobody’s surprise, they were right. I now work from home 4 days a week, take a vanpool the 5th day, and am much happier and more productive (in general!), much less stressed, and a markedly better driver.

Accomplishment

One of the first big projects on my plate was upgrading & migrating the SQL 2005 instances to new hardware and SQL 2016. We didn’t use anything super fancy for the actual migration, just backup shipping, essentially. DbaTools came in very handy for copying over the logins/users without having to hash/unhash passwords. The main hiccups were around Agent Jobs and Replication. Jobs were turned off at the old server before being enabled on the new one, but due to a lack of documentation and understanding of some of them, it became difficult to see which ones were critically needed “now” vs. which could wait a while, and which had dependencies on other SQL instances (via linked-servers) that may or may not have moved.

And replication is just a PITA in general. Fortunately, I took this as a ‘growth opportunity’ to more fully document and understand the replication environment we were dealing with. So at the end of the project, we had a full list of replication pub-subs with their articles, a sense of how long they each took to re-initialize, and a decision-process to consult when adding-to or updating articles within them.

Continuous Learning

A similar upgrade-migration project came to fruition in 2017: the ERP system upgrade. This was a delicious combo meal of both the database instance upgrade (SQL 2008R2 to 2016) and the application & related services/middleware (Dynamics NAV 2009 to 2017). And as I blogged about, it came with a super sticky horrible side-effect that we’re still dealing with over a year later: a different collation than the rest of our DB environment.

Which reminds me, I need to do some follow-up posts to that. Briefly, the “best” band-aids so far have been these:

  1. If only dealing with the ERP entities, without joins to other DBs, just leave collations alone. The presentation layer won’t care (whether that’s SSRS or a .NET app).
  2. If relating (joining) ERP entities to other DB entities, I’ll load the ERP data into temp-tables that use the other DB’s collation, then join to those temp-tables (sketched below). The “conversion” is essentially ‘free on write’, meaning when we load the weird-collation data into a temp-table that’s using the normal-collation, there’s no penalty for it.
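A minimal sketch of band-aid #2, with names invented for the example (the real tables are Dynamics NAV’s, and far uglier):

    -- stage the odd-collation ERP data into a temp-table built with the local DB's collation;
    -- the conversion happens on the INSERT ("free on write"), so the join needs no COLLATE hints
    CREATE TABLE #ErpCustomers
    (
        CustomerNo   varchar(20)   COLLATE DATABASE_DEFAULT NOT NULL PRIMARY KEY
      , CustomerName nvarchar(100) COLLATE DATABASE_DEFAULT NULL
    );

    INSERT INTO #ErpCustomers (CustomerNo, CustomerName)
    SELECT  e.CustomerNo, e.CustomerName
    FROM    ErpDb.dbo.Customer AS e;

    SELECT  o.OrderID, t.CustomerName
    FROM    dbo.Orders AS o
    JOIN    #ErpCustomers AS t ON t.CustomerNo = o.CustomerNo;   -- no collation conflict here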

As I said, I’ll try to dive into this more in a future post. It’s been a life-saver for performance problems that cropped up as a result of the upgrade & the different collations.

But the point here is that, even though this project didn’t end up as wildly successful as we’d hoped, it’s still a success, because we learned some key lessons and were able to pivot effectively to address the problems in a reasonable way. And frankly, there was no going back anyway; it’s not like the business would have said “Oh, never mind, we’ll just stick with the old versions of everything” (even though some reticent managers would probably have enjoyed that!). So even when things seem bleak, there’s always a way around.

when the going gets tough, the tough take a coffee break
Wait a minute…

Conclusion

I’m still trying to be the very best DBA & BI-Dev I can, supporting the dozens of requests & projects that the business throws at us. Fortunately, with the incredible SQL Community, a wonderfully supportive family, and a fantastic team of colleagues, I remember how far I’ve come, on whose shoulders I stand, and how much promise the future holds.

T-SQL Tuesday 106: 5 Things Not to Do With Triggers

Here are a handful of anti-patterns that I’ve seen with triggers in my time…

This month’s invite comes once again from our benevolent overlord* community pillar Steve Jones, head of SQLServerCentral.com! If you haven’t been there, stop reading immediately and go check out the helpful forums and ‘Stairways’ articles. Some truly excellent content to be had.

No, don’t stop reading immediately… save it to your favorites/reading-list and look at it in 5 minutes when you’re done here.  =)

*Though I’ve not had the pleasure of meeting him in person, I’ve heard Steve is a phenomenally humble and down-to-earth guy, so my silly comment about him is even sillier in that light. ❤

Triggers – Love ’em or Hate ’em

Borrowing a bit of a mini-game from the CodingBlocks guys, the first “Google-feud” (auto-complete result) for “sql server triggers are ” is “sql server triggers are bad”.  Well, they can be, if used improperly / to excess.  Like most bits of tech, they have their place and their use-cases.

I’ve blogged about a few such use-cases here (a “who did what” audit-trail type of trigger) and here (as part of the Nested Set Model implementation), which you should definitely read. I didn’t condone click-baity titles back then. Alas…

click this right now omg
All we need now is a “blink” tag…

Here are a handful of anti-patterns that I’ve seen with triggers in my time.

Thing 1: Using them as Queues

Repeat after me:

A trigger is not a queue.

Triggers are executed within the same transaction as the query that fires them. Meaning, they’re subject to all that ACID-y goodness of a transactional system. This is a double-edged sword. If all is successful, it guarantees the trigger will do its job when the calling query runs, which is great for audit-ability. On the other hand, if the trigger has a problem, anything and everything that fires it will also fail.

The fundamental distinction between this and a queue, is that the success of the queued action is not immediately critical to the continued operation of the thing that called (queued) it.

So if your requirement matches the latter behavior, and not the former, do us all a favor and use a real queue. Heck, find one of the few people who know Service Broker. (Hint: one’s a Warewolf, another’s a Poolboy.)

Thing 2: Making them WAY TOO BIG

Mostly because of the transactional thing, the rule of thumb with triggers is K.I.S.S. “Keep it Small and Simple.” Even the audit-trail example is a bit much if the table being audited is under significant write load. Typically, if the business requirements are both high-throughput and high audit-ability, you’ll be implementing a much more complicated tech-stack than just plain ol’ SQL Server with triggers.
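(For scale, this is about as big as I ever want a trigger to get — a bare-bones sketch of the “who did what” idea, assuming an AuditLog table shaped to match. Names are made up.)

    CREATE OR ALTER TRIGGER dbo.trg_Widgets_Audit
    ON dbo.Widgets
    AFTER UPDATE
    AS
    BEGIN
        SET NOCOUNT ON;

        -- one narrow insert, nothing else; it lives or dies with the calling transaction
        INSERT INTO dbo.AuditLog (TableName, KeyValue, ModifiedBy, ModifiedAtUtc)
        SELECT  N'Widgets', i.WidgetID, SUSER_SNAME(), SYSUTCDATETIME()
        FROM    inserted AS i;
    END;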

Some of the offenders I’ve seen include: A trigger that wanted to write to several other tables with IF conditions that branched based on what column was being updated. And a trigger that required near-SA level permissions to do some back-end maintenance-y stuff. Those are both recipes for problem pie.

pie that looks like a kraken attacking a boat
We’re gonna need a bigger… fork?

Thing 3: Doing Nothing Useful

Somewhat opposite of above, there’s no point in introducing the management and performance overhead of triggers if they’re not performing a transaction-critical operation. For instance, something better left to a queue.

Thing 4: Housing Business Logic

There’s always been a holy war about whether “business logic” belongs in the database or in the application code. And, since geeks love sub-classifying things to the Nth degree, there’s a sub-holy-war about what “business logic” actually means. But I won’t go into that here.

If you fall on the 1st side of the fence, and you feel the desperate desire to embed some logic in the data layer, it belongs in stored procedures, not in triggers. Reasons include maintainability, discoverability, performance, and documentability. Not to mention source-control on procs is a helluva lot easier than on triggers.

Thing 5: Too Many of Them

there's too many of them
Stay on target..

While multiple triggers can be defined for the same action on the same table, that’s not an invitation. You can enforce trigger execution order to an extent (first and last), but any more than that and you’re asking for confusion. Falling back on the KISS principle, if you need more than one trigger on a table & action (insert, update, or delete), you probably need to rethink the underlying design.
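(That extent being sp_settriggerorder, which can pin a trigger as First or Last for a given statement type, and nothing finer-grained than that. Trigger name borrowed from the sketch above.)

    EXEC sys.sp_settriggerorder
        @triggername = N'dbo.trg_Widgets_Audit'
      , @order       = N'First'
      , @stmttype    = N'UPDATE';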

Bonus

Using INSTEAD OF vs. AFTER: it’s fairly self-explanatory, but just be aware of what you’re doing, especially with the former. You’re literally replacing the desired action of, say, an update query with the contents of your INSTEAD OF UPDATE trigger. If this is not obvious to all users of this table, you’re in for some really surprised faces and angry messages.
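(A tiny hypothetical example of the kind of surprise I mean — this one hijacks every DELETE on the table and raises its own error instead.)

    CREATE OR ALTER TRIGGER dbo.trg_Widgets_NoDelete
    ON dbo.Widgets
    INSTEAD OF DELETE
    AS
    BEGIN
        SET NOCOUNT ON;
        -- the original DELETE never runs; anyone who didn't know this trigger existed
        -- will be staring at this error wondering what just happened
        RAISERROR(N'Deletes on dbo.Widgets are disabled; use the archive proc instead.', 16, 1);
    END;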

And that’s all I have for today folks! Enjoy triggering stuff. In moderation, as all things. =)