I know this because I am a developer. My heart constantly wants to code up the solution to...well, anything. What I have learned over my years of developing and architecting enterprise software solutions, and as the solo developer of my website project, is how this love of code can actually slow down, and sometimes halt, the development of a project or feature. We get so caught up in the tech that we don't take the time to reflect on and solve the actual problem.
How do you fix this habit? Before you start coding up a solution, make sure you understand the problem you are trying to solve. Seems simple enough, yet developers (like me) have the habit of jumping right into the code before they really know what they are trying to solve.
Through my years of experience solving problems with technology, I have a couple of steps I go through to help inform my solution design for a variety of problems. I apply these steps when I am trying to figure out how to integrate two enterprise systems and when I'm trying to figure out the best way to implement a new feature on my website.
The steps are the same, although the effort required will vary.
And I don't mean a coding problem.
I mean a business problem, or a real-life problem, or whatever you want to call it-- but it's not a code problem. Never have I ever been asked by a client to "implement a binary tree" or "write a sorting algorithm for an array". That's not to say those aren't problems, but they aren't business problems. They are technical problems, and they are fun to work on...sometimes. 😅
Business problems are the reason clients engage with software developers. The client wants software to fix their problem because they believe software is the solution. Before you code anything, take a few moments to answer the following questions about the problem you're preparing to solve.
I am not suggesting you second-guess the client, but rather try to empathize with your client and really understand why their problem is what it is. This is where you can start to understand whether or not software development fits into the solution to the problem. I have come across this many times, where after revisiting the problem with the client, we found the best solution was a change in their business process rather than adding tools to it.
Let's assume, for the sake of this post, that you see where software can help play a role in solving the problem.
Sounds silly, I know, but doing nothing is always an option and people do it all the time. But why would someone choose to do nothing? Because the risk doesn't outweigh the reward.
By answering this question with your client, you get to understand the risks associated with the problem. This will inform your solution design: if the risks are high, you may want to invest more time and effort into some parts of the design than others. It will also give you context on the priority of your solution in the mind of your client.
The last thing I do is pull out any key performance indicators (KPIs) or metrics that will help define success for the solution. I find that most of the time, this is about turning qualitative terms and statements into quantitative ones.
For example, "We need to process these forms faster" should change to something like "We should be able to process at least 100 forms an hour". See the difference?
You are adding clear, measurable success criteria for your solution. The terms "these forms" and "faster" are too vague to build on. Maybe "fast" to you is one form a day, or maybe one form a second. Your client is the expert in their business, so ask them so you can understand the goals and potential constraints your solution needs to address.
I know-- your hands are itchy from not coding, but assuming you took the time to understand the problem, the next step is to confirm your newfound knowledge. The easiest way to do that is by explaining it to someone else, like your client. If your client agrees you nailed it, you nailed it, and now you're ready to start designing (not coding) your solution.
It is not uncommon for your definition of the problem to sound different from the problem your client originally described. This is normal, as you are the technology problem-solving expert.
The fact that your definition of the problem differs from your client's isn't necessarily a bad thing either. Many times, I have found that through my problem definition process, the client gains a better understanding of the root cause of their problem, and their mind will shift from their presumed solution to something else.
Let me walk you through the process on something not so enterprise-level, but small scale, like a solo-developed website project.
I hit a problem while planning the next release of my website: I realized it was going to be very complicated and cumbersome to add non-blog content to my website, such as the presentation materials from Prairie Dev Con here and here. At this point, here is what we know:
- Client = Me
- Problem = Adding non-blog content to the website is difficult.
Like a good developer, I immediately started down the path of designing a custom application that would automate all the things that make adding content difficult. It was very fun, but after a couple of hours, I caught myself, took a step back, and applied my problem definition process.
Let's go through it, and we start by understanding the problem.
It is a problem because I want to continue to add different types of content to the website. The whole purpose of the site is to create a central hub for all my work, almost like a portfolio, but more like a "hub" for all the things I create and share. The website is built to handle blog posts and document-style content, but when you add more complicated content that is made up of more than just an article or webpage, you need to add links to other data (like files), which is a manual, error-prone process.
In short, it is a problem because maintaining non-article data will be difficult.
You can see on the talks page that I have already added some non-article data, which is all currently managed through a JSON file that the website generator picks up and creates pages from. I also needed to upload the files to a public storage host (Azure Blob Storage) and copy and paste the links into the JSON, which I messed up a few times.
This was my first attempt at "doing nothing" for this problem, and it was difficult. The plan is to add the back catalogue of presentations I have done over the past 10 years (probably more), which will make that JSON file exceptionally difficult to manage.
When you frame it in the context of risk: doing nothing will very likely result in a massive increase in the number of errors in the data.
If we look at the original problem statement, "Adding non-blog content to the website is difficult", we need to translate the term "difficult" into something quantitative. That gives us a measure to determine how much easier it is to add new content.
Pulling from the answer to question 2, it's really managing the JSON file that makes things difficult. And so I asked myself (the client), what makes managing a JSON file so difficult? There are plenty of tools for that already. And this is where the real problem revealed itself.
The relationships between the data lead to errors. Maintaining these relationships manually is exceptionally difficult, and we only have two relationships so far: presentation to event, and presentation to presentation materials.
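To make those relationships concrete, here is a hypothetical slice of what such a JSON file might look like (every field name, value, and URL below is invented for illustration, not my actual data file):

```shell
# Write a hypothetical talks.json: each presentation references an event
# and its materials by URL -- two relationships maintained by hand.
cat > /tmp/talks.json <<'EOF'
[
  {
    "title": "Example Talk",
    "event": "Prairie Dev Con 2022",
    "materials": [
      "https://example.blob.core.windows.net/talks/slides.pdf"
    ]
  }
]
EOF

# Every one of these cross-references is a chance for a manual error.
grep -c '"event"' /tmp/talks.json
```

Multiply that by a ten-year back catalogue and the error surface grows with every entry.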
Now that we know the real problem, we can redefine it:
Problem = The process of manually managing the relationships between content types and data is exceptionally error prone and not scalable.
This updated problem is the one that will inform the solution design moving forward. If you want to get specific about the tech, we have a very powerful and mature tool for solving data relationships: a relational database. How it informs the solution is a whole other blog post (or posts), but at least now we know what we are trying to solve and can use our technical expertise to solve it.
Before you start designing solutions or coding, take the time to clearly define the problem you are working to solve with your client (which can be you, if it's your own project). To define the problem, answer these questions first:
- Why is it a problem?
- What happens if you do nothing?
- How will you measure the success of a solution?
Once you have that, redefine the problem by wording it in a way that highlights the root issue to solve, along with the way to measure success. Assuming the client agrees with your redefined problem, you are ready to start using that big, beautiful brain of yours and start solution-ing!
Thanks for playing.
~ DW
When trying to add a key using apt-key on a Debian 11 Docker image, the step seems to run infinitely. This problem appeared when adding a key that is necessary to validate the mono-complete package.
I set up a DevContainer to build Inky, an interactive fiction editor I like for game projects, without having to install all the build dependencies on my local machine. The Docker container build worked on my Linux machine, but would hang on my Windows 11 box running Docker Desktop with WSL2. More specifically, it would run forever on the apt-key command specified by the mono install instructions.
If you need an example, take a look at my Inky repository fork at that specific point.
The issue was that the command specifically references port 80 in the URL to the keyserver. In the end, I changed:
sudo apt-key adv --keyserver hkp://keyserver.ubuntu.com:80 --recv-keys 3FA7E0328081BFF6A14DA29AA6A19B38D3D831EF
to
sudo apt-key adv --keyserver hkp://keyserver.ubuntu.com --recv-keys 3FA7E0328081BFF6A14DA29AA6A19B38D3D831EF
You can see the specifics in the next commit in my example repository.
I was put on the right track by a Stack Overflow post about a similar issue with apt-key. Scrolling through the answers, I found this one: LINK
apt-key Deprecation Notice

If you look at the Debian documentation for apt-key, or try running the command yourself, you might notice the deprecation warning. Under the hood, it still runs the appropriate command in Debian 11, but it will be gone after Debian 11 and Ubuntu 22.04.
Just something to note for those looking over this solution in the future.
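For readers maintaining similar Dockerfiles after the deprecation, here is a sketch of the keyring-based replacement. The keyring path and repository line below are illustrative assumptions, not the official mono instructions, and the network/privileged steps are shown as comments:

```shell
# Sketch of the post-apt-key approach: store the key in a dedicated
# keyring and reference it with signed-by in the sources entry.
KEY=3FA7E0328081BFF6A14DA29AA6A19B38D3D831EF
KEYRING=/usr/share/keyrings/mono-archive-keyring.gpg

# Fetch and export the key (note: no :80 port suffix on the keyserver).
# gpg --keyserver hkp://keyserver.ubuntu.com --recv-keys "$KEY"
# gpg --export "$KEY" | sudo tee "$KEYRING" > /dev/null

# Reference the keyring explicitly rather than the global trusted set.
ENTRY="deb [signed-by=${KEYRING}] https://download.mono-project.com/repo/debian stable-buster main"
echo "$ENTRY"
```

The signed-by option scopes the key to this one repository, which is the behaviour apt-key's global keyring never gave you.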
I needed to remove the port number from the keyserver URL used in my apt-key command.
Thanks for playing.
~ DW
Don't worry-- it's not all feelings. It's data too. All of my observed behaviours relate to the projects I worked on throughout the year on GitHub, which provides great insight into my contributions. I'll be using my GitHub contributions for 2022 to highlight the spots where I can identify each behaviour.
Why do this? Because I want to remind myself and others that if you feel like you are stuck, you are better off finding the source of the problem-- even if it makes you face some hard truths. By understanding the root problem, you can work at resolving it, even if it involves changing what you believe is your best approach to work.
At the end of 2021, I started looking at the job market and noticed that the jobs I wanted (or thought I wanted) relied on skills that I have not been able to practice as part of my day job. Coding is no longer one of my responsibilities-- only planning, designing, and providing oversight. This sparked the urge to refresh my skills and prove to myself that I could do these jobs, and that all I had to do was put in the time.
And so began a series of LeetCode challenges, learning exercises, and review of various problems so that I could skill up and strengthen those coding muscles again. This is what you see in the contribution graph for the first 2-3 months of 2022.
Although this spark eventually faded, as it always does, I realized something about myself: it's not the code I love, but learning about code and how to apply it in various ways. New languages, patterns and practices, solution architecture, whatever-- if it involves coding something, you can count on me being interested.
This example highlights Q1 of 2022, yet there are plenty of other times when I spent time learning new tech: experimenting with Go and Rust as part of my VGL project (more about that later), a brief experiment with Q# back in 2019/early 2020, and my continual urge to learn C/C++ along with the DevOps tooling around it. These are all things that have sparked that love of learning code over the past few years, and each time it's the same pattern: a spark of interest, a deep dive into learning, then burnout because I don't know where to go with the knowledge.
Which brings me to the lesson learned: I need to direct my learning energy towards a goal. This way, when the excitement of learning something new fades, I will still have a goal in my sights and continue to channel that energy towards something, rather than letting it fade out.
At the end of the year, I looked at my GitHub Unwrapped video and was surprised by my top languages for 2022.
I was trying to figure out where I had written so much TypeScript, considering that for the past few months I have been living in JavaScript and HTML. Again, going back to my contribution graph I noticed another spike in activity in May.
I remembered that I had decided to repurpose my learning strategy: rather than doing LeetCode exercises and textbook studying to strengthen my atrophied coding muscles, I would learn by building something. Something I found useful, all while further strengthening my skills! This was the beginning of the Video Game Library or "VGL" project, where I spent time building a TypeScript-React project and experimenting with both Go and Rust to determine which language would let me leverage WASM (yet another rabbit hole I became excited about).
In the end, I shelved the project because I was letting my learning drive it: anything I wanted to learn, I added to the project's scope. It became too big and my original vision was lost. The urge to build never fades-- only the "something" that I am building changes.
Looking back beyond 2022-- the idea of building something has always driven me. Building a business, a video game, or a product. It doesn't matter, as long as I am building it.
Where it falls over is when the scope gets too large and overwhelming. This is not uncommon amongst creative types (just ask any game developer) but building something, ideally out of code, is something that drives me. If I can channel that excitement and passion on something I believe is worth it, I think I could produce and finish something I could be proud of.
I started to make this realization about myself and my drive to build things later in the year. This is why I came back to my website, which I had let fall dormant. I wanted to channel that excitement, energy, and knowledge into something I found valuable. My website is something I have talked about improving for years and started redoing countless times. Looking back at the contribution graph, it represents a large part of my contributions for October, November, and December of 2022. This is further evidenced by the website releases I published through the same period.
Reflecting on my behaviour during the VGL project in May and the website work in the last quarter of the year-- the behaviour and drive were the same. I loved building something, yet the VGL project went onto the shelf while the website finally managed to get some traction.
The difference was in my approach.
For the Video Game Library project, I let the excitement of learning drive its development, which led to scope creep and dilution of the original project vision. With the website, I took the time to plan and forced myself to complete releases-- no matter how small.
This change in approach resulted in a longer focus on a single project. Ultimately, that kept my excitement for the website going, and I kept coming back to it over and over to make small (or sometimes larger) improvements. In fact, I am continuing those improvements today, as the website is far from complete-- but it's starting to represent the vision I started with.
I have tried sharing and producing content in various forms over the years, but with Prairie Dev Con returning in 2022, I thought I would focus some energy into preparing and sharing content like I used to in my Microsoft MVP days. This meant lecture-style presentations and blogging.
After three live events in 2022, and almost a blog post a week since mid-October, I realized that I don't love sharing like I used to...rather I only like it. It's a subtle difference, but it is definitely different than it once was.
I like it because it is a practical way to document my work. I love learning and building things, and sharing those things is an easy way to document my progress for others-- but more importantly, myself. With the blog posts, I documented things I learned for my website, like the Open Graph protocol and my implementation of GDPR compliance. For the presentations, I focused on what I knew and delivered two original sessions: one about my day job and what it means to be an IT Architect, and the other a case study on how to do my day job.
Through this experience, I found that I liked the process-- but didn't love it like I used to. To me, the presentations and blog posts were means to other ends. More specifically, the presentations were my ticket to reconnect with other real-life speakers and tech professionals after a multi-year hiatus. The blog posts were my way of documenting, analyzing, and appreciating my own effort on my various side projects.
In the past, with the MVP program, I blogged and shared to receive validation from my peers and the MVP program itself. Those goals are not bad ones by any stretch, but since I don't have the MVP program pushing me, I need something else to do the pushing. That "something" is myself, and the outcomes I mentioned previously. Personally, I think that means I've grown quite a bit since my MVP days, and it's a great example of how 2022 was a year of huge change in my attitude towards work.
I have mentioned the good things and the changed things; now I will go over the things I need to improve (in my opinion).
Everybody is different and brings different value to the table. I have led a very privileged career and have had massive success in many different areas, yet for years I have rarely taken the time to appreciate those accomplishments.
Instead, I would get caught up in comparing myself to others and what I couldn't do, rather than what I could do. I would dwell on my lack of recent coding experience, rather than celebrate the time I've spent migrating legacy systems into the cloud. I would focus on the jobs I did not qualify for, rather than the ones that I did qualify for.
This cycle of focusing on what is missing is a lose-lose situation. There will never be enough success. The grass will always be greener on the other side of the fence, no matter how many times I jump over it.
I need to remind myself of this moving forward, and hopefully you can remember that for yourself as well.
People refer to me as "a talker", as in, I like to talk and I'm pretty good at it.
I leverage my talking skills in my day-to-day job, but when it comes to what I am trying to build for myself, I need to focus on doing the work rather than talking about it.
It might be cliché, but "talk is cheap" and I need to talk less and do more. Plain and simple.
In short, I identified cyclical behaviours and patterns in myself that relate to the work I put into my various side projects and personal (and professional) development. In 2022, I noticed the following about myself:
- I love learning about code and how to apply it.
- I love building things, especially out of code.
- I like sharing my knowledge, but I no longer love it the way I used to.
The first two are my way of channelling creativity, which is why I love them so much. Although I used to love sharing my knowledge, at this point in my career and life I like it as a practical way to document things as I discover them and connect with others, rather than as a method of being validated and rewarded.
In terms of how I can improve:
I need to accept and embrace my current skills and abilities, rather than focusing on what I think I am lacking. I also need to focus more on implementing my ideas rather than talking about them. Once I have something built, then I can talk more about it-- but until it's built, I need to focus my energy and excitement on the build rather than the talk.
Thanks for playing.
~ DW
When the call for speakers opens up, you are required to submit a summary of your talk and yourself. I call this the pitch process, as your submission is your moment to convince the event organizers you are worth betting on.
It might sound stressful, but it's not. It's a pretty low-key process considering you are just filling out a form, and it's low stakes: if you don't make the cut, you can try again next time.
The point is that you need to take the time to think about why you're worth the effort, because you are definitely worth it! You know it, so now is your chance to practice making the case.
Once you're accepted, you get a chance to connect with other speakers. These folks are like-minded people who are willing to spend their time sharing their experiences and expertise. Sit with people you don't know and have conversations. Introduce yourself. Talk about what you do and listen to what they do. When you're done, find them on LinkedIn and remind them where you met.
I have met some of the best people this way and have continued to stay connected beyond the conference (shout out to the WesternDevs).
As much as I appreciate livestreaming and virtualized meetings, speaking in the same room as other humans is very different and definitely develops a different set of skills and strengths. The interaction you get with your audience during and after you deliver your session is something I have not been able to replicate in the digitally transformed world we live in today, in 2022.
Just to be clear, something will go wrong...and that's okay.
No matter how much you prep, something will go wrong. A demo will fail, a slide will be out of order, a question will be asked that you don't have the answer to. The key is in how you react and respond to the situation. These "mistakes" are what have made me a better presenter in my day job. They have also helped me learn to stay calm and collected when pressure is being applied.
As an aside, I want to note that not all conferences are created equal.
Before you submit your session, take note of what the conference does to support its speakers, and ask yourself a few questions about what you're getting before you commit your time and effort.
There are no right or wrong answers to these questions, but you should consider what you're getting out of the deal when you submit sessions to a conference beyond professional development.
Just remember that the speakers are the talent that makes a conference possible. Your work is valuable, and the conference team should ensure you feel appreciated, one way or another.
Speaking at in-person events, like tech conferences and user groups, is a great way to grow as a professional. Key benefits are:
- Practicing how to pitch yourself and your work.
- Connecting with other speakers and like-minded professionals.
- Developing in-person presentation skills you can't replicate online.
- Learning to stay calm and recover when things go wrong.
Thanks for playing.
~ DW
To be fair, I should highlight that this is definitely a self-induced problem. The Docker Engine prerequisite is listed right in the README for nektos/act, and had I reviewed the documentation, I probably would have saved myself the trouble. Still, in my web sleuthing for solutions to the problem I created for myself, I found others had hit similar issues-- hence this post.
I discovered the problem when I attempted to test my GitHub workflows locally using nektos/act, a tool I have been using for the past few years. It works by pulling down a Docker image that simulates the GitHub runner and running the workflow in that container. I have done this a few times before, so I went to one of my older projects where I had set this up and pulled in the code to get it running.
Being that this was a fresh Linux install, I had not installed Docker yet. When I searched out the installation instructions for Docker on Linux, I was greeted with the announcement that Docker Desktop was now available for Linux.
I have been using Docker Desktop on Windows for a while now, and I am always happy to have software that exists across my Windows-Linux development ecosystem, so I went about installing Docker Desktop as my new Docker install.
After testing my new and shiny Docker Desktop installation with the standard docker run hello-world, I was ready to get back to coding!
Or so I thought...
This is where things went sideways and the problem appeared. I ran act -j build to run my build job in a workflow I knew had worked previously, and was greeted with the following error message:
Cannot connect to Docker daemon. Is the docker daemon running?
Not what I expected, considering I had just tested my fresh Docker install. I tried pulling the image down myself with the docker pull command just to make sure things weren't broken, and everything worked as expected.
With a bit of web sleuthing, I came across others who reported the same issue and noticed this link in particular:
You could check if /var/run actually contains docker.sock
When checking this, I found that docker.sock was in fact NOT present. I immediately associated this with the Docker Desktop installation, as that was the only new variable from my previous development environment.
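If you want to run the same check yourself, here is a minimal sketch (the path is the standard Docker Engine default socket location):

```shell
# Check whether the Docker Engine socket exists at its default path.
# -S tests specifically for a socket file, not just any file.
SOCK=/var/run/docker.sock
if [ -S "$SOCK" ]; then
  echo "found: $SOCK"
else
  echo "missing: $SOCK"
fi
```

If it reports missing, tools that talk to the engine through that socket (like act) will fail to connect.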
This is the part where I wasted my time trying to figure out why Docker Desktop did not install docker.sock, rather than figuring out how to install the Docker components that were missing.
Although I am no Docker expert, my understanding is that Docker Desktop runs Docker inside a VM rather than on the system itself, unlike Docker Engine. In fact, you can see a separate Docker context when you list out the contexts.
It should be noted that the default context for Docker was listed, even though I had not installed Docker Engine yet. This led me to believe something I installed was incorrectly configured, but really it was the fact that I had not installed the software I needed.
As technical as I made it sound, the real problem was that I was missing software. Specifically, I was missing "docker" on my Linux machine, even though I had installed Docker Desktop. 😊
Well, if the problem is that I am missing software, then the solution must be to install it. That software is Docker Engine, which sets up the Docker API right on the machine rather than through a VM like Docker Desktop (as far as I understand it).
In conclusion: install the software dependencies your tools require. If you're running a Linux distro, as great as Docker Desktop is-- you may still want to install Docker Engine. You can always switch which context your own docker commands run against with the docker context use command, but it's worth double-checking that the tool you are using supports Docker Desktop on Linux.
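To illustrate the context switching, a quick sketch (assuming the docker CLI is installed; Docker Desktop's context name varies by install, e.g. "desktop-linux"):

```shell
# List the configured Docker contexts -- the active one is marked
# with an asterisk -- then switch to the "default" context, which
# talks to the local Docker Engine socket.
if command -v docker >/dev/null 2>&1; then
  docker context ls
  docker context use default
else
  echo "docker CLI not installed; skipping"
fi
```

Note that switching contexts only changes where your own docker commands go; a tool hard-wired to /var/run/docker.sock still needs Docker Engine.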
Thanks for playing.
~ DW
Joel did a great session about API-first design. It was a very dense session, but he delivered the content in a way that was very approachable and let me think through the benefits of doing API-first design with tools like Swagger.io and OpenAPI.
It was great seeing the value of these tools, and hearing about the patterns and practices experienced API developers like Joel use to implement consistent and secure APIs.
I went into this session thinking I would be fascinated with the subject, but that the concept would apply only to development leads or coders, rather than an architect like me.
I was wrong.
The Developer Velocity Index (DVI) is a way for any team (even a one-person team, like me on my side projects) to frame and scope the abstract problem of figuring out how to deliver more value.
I plan on applying the DVI to my side project adventures, self-development, and my enterprise day-job efforts as soon as possible.
Although Dave and Lavanya delivered two completely separate sessions related to testing, the content they delivered worked together in a very interesting way.
Dave demonstrated and discussed Playwright, an end-to-end testing framework that resolves or improves many of the problems we commonly see with end-to-end testing. Lavanya demonstrated how to apply proper code management and development techniques when creating test code with a framework like Playwright.
For me, together they demonstrated why the test-recording features of end-to-end frameworks are not the "best approach" to creating tests, but rather only the first step.
I feel that these ideas will be seeping into both my day-job and side projects in the very near future.
Adam closed the Prairie Dev Con season with his session, and managed to leave me with a lot of ideas and helped me identify gaps that I have been living with as a developer and as a solution architect.
Ensuring that developers are security-aware is something I didn't realize I have been missing in my own skills, but also should be looking for in the implementation of my solution designs.
Rod delivered a keynote in both Regina and Winnipeg, and each time I walked away with a positive outlook on my own professional and personal growth, but also with the reminder: A Deal Is A Deal.
Sounds simple enough, but in the past I have frequently found myself regretting decisions or deals I had made with myself or others. But, a deal is a deal, and even if you don't like it or regret it, you need to take a moment to learn from it and ensure the next deal is one you won't regret.
In short, there were a lot of good ideas at Prairie Dev Con 2022. These are the ones that stood out to me the most:
Thanks for playing.
~ DW
With my recent adventures reimplementing my website, I wanted to leverage the Open Graph protocol (OGP) on pages and posts, specifically with LinkedIn, and it took a little more research to get it working right. So, for the web nerds like me looking to implement OGP on their projects, I wanted to share the resources I found useful and hopefully save them some time.
ogp.me
I am calling this the specification, or "spec", and it is probably the most important resource. The best part about this site is how approachable it is.
There are code snippets, explanations of all the object types and their properties, and its own list of tools (although they differ from the ones I am including on this list).
If you take one thing away from this post for your work with OGP, take this one.
Both Facebook and LinkedIn provide a developer tool to analyze and verify your implementation of OGP, with the added ability to bust whatever the social networks have cached for the pages you share.
These tools are great for triaging or assessing publicly shared pages, but not so much for local development. That is where the next tool comes into play.
Available for both Chromium browsers and Firefox, this web extension allows you to simulate what should appear for any page loaded in your browser.
This tool saved me from having to continually publish content to a public location for the post inspector, but note that it is just a simulation of what the extension thinks should appear. It does not replace the post inspector or proper testing on the site you are sharing to.
If you are reading this post, then this one is obvious-- but sometimes we (me included) get so caught up in exploring new ways to solve a problem that we forget about the obvious ones.
OGP tags live in the <head> of your HTML page. If you are unsure why things are not working, run your browser dev tools of choice, check the <head> of the document, and make sure the OGP tags you are expecting appear where they should be. It seems simple, but depending on the tool, engine, or framework outputting your HTML, you may be surprised by what shows up.
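As a quick check outside the browser, you can also grep the generated HTML before publishing. A minimal sketch (the file path and property values below are made up for illustration):

```shell
# Write a sample rendered page, then pull out any og: properties from
# its <head> -- the same check the dev tools let you do visually.
cat > /tmp/ogp-sample.html <<'EOF'
<html><head>
  <meta property="og:title" content="My Post" />
  <meta property="og:type" content="article" />
  <meta property="og:image" content="https://example.com/cover.png" />
</head><body></body></html>
EOF

grep -o 'og:[a-z]*' /tmp/ogp-sample.html
# prints og:title, og:type, og:image (one per line)
```

Pointing the same grep at your static site generator's output folder is a cheap way to catch a template that silently dropped your tags.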
Open this post on a desktop browser and press the key combination Ctrl + Shift + i
and you should see your browser dev tools pop open for the site.
Read the approachable spec document. That is the most important takeaway from my OGP implementation. It gives you a strong foundation to work from as you use other tools to triage and assess your implementation.
These are the tools I used to implement LinkedIn support, along with my browser dev tools.
Ctrl + Shift + i on your desktop browser
Thanks for playing.
~ DW
]]>What I found odd was that all the links and articles I came across seemed to talk about things at a high level (i.e. defining GDPR) or assumed I was working at a large scale (i.e. enterprise software), but nothing covered small projects like my personal website.
Still, I managed to draw some of my own conclusions on how to handle GDPR for my personal website and wanted to document them somewhere.
I am not a lawyer, so this is just an opinion from a developer. As a rule of thumb, I avoid taking legal advice from random folks on the internet. If you take advice from this article, take that bit and keep it.
I hope others (like you) use this post to draw your own conclusions on how you want to proceed with your own plan for handling GDPR.
But if you want real advice: get a lawyer and talk to them.
Yes, it does apply to your personal website if you are tracking information about your users and you are developing your own website or application.
By developing, I mean coding it, publishing that code, and hosting it somewhere like Microsoft Azure or GitHub Pages. If you are publishing your own code, GDPR may apply to you.
If you are using a third-party tool or platform, like Facebook or LinkedIn, to host your blog posts-- you appear to be in the clear. When you use a third-party platform, the platform, not you, is responsible for GDPR compliance.
Even if you think you are clear of GDPR responsibility, make sure that you trust your chosen platform to comply with GDPR and the other regulations out there, as your site depends on it.
The GDPR is all about protecting personal information and giving control back to people navigating the internet. GDPR is not the only set of laws in play, as California, Brazil, and Canada have their own versions of similar legislation, but many of these laws seem to have been inspired by GDPR and why I tend to focus on it.
At the personal website level, you need to consider whether or not you are collecting personal information from your users. This includes things like IP addresses or cookie identifiers.
If you are NOT collecting information like that, you are good to go! Just remember that services like Google Analytics or Disqus Comments use personally identifiable information to operate, so if you have decided to include one of those services on your site then you need to think about GDPR compliance.
I concluded that GDPR-like laws apply to my personal website if I want to do any kind of usage tracking to understand how users are using my site. This means it needs to be an opt-in policy that gives the user the option to do just that: opt in.
The dialogue above is the only real visual evidence on the site now. As simple as that looks, a lot of thought went into it prior to implementation. Rather than doing a complete code review, I figured I would share the highlights.
My default would have been to just include something like Google Analytics and be done with it, but with GA use being ruled unlawful by some data protection authorities in the EU and more countries creating their own GDPR-like legislation, I thought I would stay away from it and try something different.
I chose Application Insights and took the time to learn how it handles data privacy and retention and how the JavaScript SDK uses cookies.
Regardless of what you choose for your analytics or tracking tool, the important part is that you understand how the tools are GDPR compliant and how the tracking technology works.
You've seen millions of them already, but those cookie banners have a purpose. The GDPR website outlines the requirements around using cookies, and many tools use them. The important thing is that you know how your website works, along with all the dependencies you choose to include.
In my case, the cookie banner enables cookies in Application Insights, which in turn enables usage data collection, but only if the user clicks "Accept".
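As a sketch, the consent gating looks something like this (the function names and storage key are illustrative, not the actual code from my site):

```javascript
// Illustrative sketch of consent-gated analytics. The analytics init
// function only runs once the visitor has clicked "Accept", and the
// choice is remembered so the banner isn't needed on every visit.

function hasConsent(storage) {
  // storage is anything with the Web Storage shape (e.g. window.localStorage)
  return storage.getItem("analytics-consent") === "granted";
}

function recordConsent(storage) {
  // Called from the cookie banner's "Accept" button handler
  storage.setItem("analytics-consent", "granted");
}

function initAnalyticsIfConsented(storage, initAnalytics) {
  // initAnalytics would enable the tracking SDK (e.g. Application Insights)
  if (hasConsent(storage)) {
    initAnalytics();
    return true;
  }
  return false;
}
```

The key design point is that nothing analytics-related runs until `initAnalyticsIfConsented` sees a recorded "Accept".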
This last point is less technical and more about design. I am designing with transparency at the front of my mind. I added a privacy statement to my about page to explain the "why" around using Application Insights, and will share more specifics and document them accordingly.
GDPR and the various GDPR-like laws definitely apply to you and your personal website or app project if you are building the code yourself, assuming you want to track information about your users.
The short story on this is that you need to draw your own conclusions and take responsibility for what you include in your website. If you are developing something to share outward into the world, you need to take the time to understand how the various tools you include (such as Google Analytics or Application Insights) work, as well as the requirements for compliance.
Two resources I found useful in explaining GDPR requirements are provided on the site GDPR.eu. If you are looking for more information, I definitely suggest checking out these links:
Thanks for playing.
~ DW
]]>Being a solution architect during the day, I wanted to apply some of my newfound skills (and appreciation) for documentation while working on v10 of my website. Ultimately, documentation is necessary, even on personal projects. If I think back to my own projects, they can end up sitting on the shelf for a long time. When I go back to revisit one, getting it into a running state without any decent documentation can be a serious time sink, with nothing to go on other than analyzing my own code (especially on prototype stuff).
And I do mean needed, not wanted. Everyone wants documentation of all kinds, but what does an audience of one (i.e., your future self) need to get the project back off the shelf and into working order?
Like any good solution architect, I started to read, learn, and figure out what others consider "enough documentation" or "good documentation". I also spent time defining the problem I needed the documentation to solve, and landed on the following docs being "enough".
It might seem obvious, but I have read enough of my own empty or default README.md
files to know that this is easily the most important piece of documentation you write. Without it, the project will require code analysis to figure out what it actually is, and that is never good.
There are a lot of great example README.md
files on GitHub to look at, but I would suggest you start simple if you're just getting off the ground. My take was to include system requirements and the steps to set up, build, and start the project for the developer.
When searching for info on this, I found this article from Hillary Nyakundi, which provides a great "how-to" on making a good README.md
.
This is one I picked up from my day job as a Solution Architect in a large enterprise. Decisions you make along the way need to be documented, even if it is only for yourself.
The idea is to document decisions that will have a long term impact on your project. Decided to document decisions? That can be documented. Decided that you only want your project to run on Azure? That can be documented. Decided to design your solution around a specific pattern? That can be documented.
You can document as many or as few decisions as you want. In the case of my website project, I documented a few core decisions early on because I wanted to remember why I built it this way. Even though I am adding content regularly and tweaking features frequently enough, I could shelve the development at any point.
In terms of format, there are plenty of ways to document decisions and why it is important, but I am not going to spend time explaining that. Instead, I would recommend reading how GitHub documents decisions. That is where I started, and they have a great breakdown of the different formats and tools that can support you, if you're inclined to get into the tooling.
For the website, I decided to use MADR as my decision-record template, and documented "why" I chose it as the first decision for the project.
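To give you a feel for it, a MADR-style record looks roughly like this (a sketch based on the MADR template; the content is illustrative, not my actual decision record):

```markdown
# Use MADR for architecture decision records

## Context and Problem Statement
I need a lightweight way to record long-term decisions for a solo project.

## Considered Options
- MADR
- Nygard-style ADRs
- No records at all

## Decision Outcome
Chosen option: "MADR", because it is lightweight and easy to keep up
as a solo developer.
```

One file per decision, kept in the repo next to the code it explains.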
The last bit of documentation I feel I need (although it is not as important as the previous two) is solution diagrams.
Just like decision documentation, this is something that can take a lot of different forms. Personally, I am not a huge fan of diving into UML or any of the traditional diagram styles. I like diagrams that present well to multiple audiences and explain one thing well.
The above diagram is one I created to explain how I set up all the pieces inside of Microsoft Azure to host my website. The diagram answers the question "What is necessary to host your application?", which goes beyond the code in my case.
There is no real format that I applied here, but I scoped it to focus on the Azure infrastructure and services I needed to rebuild the solution in Azure from scratch. It is almost like a high-level guide explaining all the different pieces that need to be set up and handled.
As for diagram formatting, although I did not use it in this example, the C4 model is something I have been messing around with to describe systems and projects in my day job. If you need a little direction, or are struggling to figure out how to diagram your project, it might be worth a look.
CHANGELOG.md
I wanted to highlight this, but also point out that it is definitely not required. A CHANGELOG.md
allows you to document your progress.
I based my CHANGELOG file off of the format described at keepachangelog.com. It forced me to take a bit of time (really, like 15 minutes or so) to reflect on and appreciate the effort I have put into the project. Plus, it tells the story of how the project has evolved over time, which, just like the decisions, provides context on how things got to where they are.
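For reference, a keepachangelog-style entry looks something like this (the version number and entries are made up for illustration):

```markdown
## [Unreleased]

## [10.1.0] - 2022-06-01
### Added
- New blog post layout

### Fixed
- Broken links on the archive page
```

Each release gets its own dated section, with changes grouped by type (Added, Changed, Fixed, and so on).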
In short, the documentation I need (not want) consists of the following, in priority order:
README.md (that at least says how to set up, build, and run the project)
Decision records
Solution diagrams
CHANGELOG.md (not required, but provides more context and forces you to appreciate the effort you have put into your project)
Thanks for playing.
~ DW
]]>Before I forked, I thought forking wasn't for me. I thought I was too old to fork, but man oh man was I wrong. Then came the day where a library I was using was missing a critical feature, and a quick search through the repository issues found that others were looking for that feature too.
This is the moment where I got to choose. I had a few options for my next move:
The first choice makes sense if you don't have the knowledge or skills.
The second choice feels easier, but that is only your fear of contributing getting the best of you. When you add to your own codebase, you are adding more code to support in the long run, and everything that comes with that.
The last choice might make you nervous if you haven't forked in a long time, but I assure you, if you can code, you can fork. So browse through the code and see if you can find the spot where your fork can help.
This is very subjective, but when it comes to forkable projects for me, I look for the following things, in this order:
CONTRIBUTING.md
, to give me a breakdown on how the community wants people to contribute
In my recent contribution to markdown-it-eleventy-img, I went through the repo trying to figure out whether or not it was forkable. I didn't find a CONTRIBUTING.md (but that could be a future PR), but I did find a set of tests, and even though I forgot in the moment, there was an existing issue from someone else about the same problem I was hoping to fix!
And with that, I knew this project was forkable. So I pulled out my finger, clicked "FORK" like a boss, coded up my solution, and submitted a PR.
If you look through the thread of the PR you'll see that my solution went through a few iterations and changes after receiving feedback from the project owner.
This was a great conversation and it led to a better solution than my original submission, which left me exceptionally happy with (and proud of) my contribution.
Even though it is volunteer labour, remember that both you AND the project owners/admins are choosing to spend time reviewing and analyzing your work. Everyone involved in the fork is investing time, and everyone should be treated with respect and as a professional.
Plus—this is a great opportunity to level-up your development soft skills. Enjoy yourself, but be timely and respect the investment everyone is making.
To fork like a boss, all you need is a project ready for contributions, some confidence, and respect for others on the project:
CONTRIBUTING.md
Thanks for playing.
~ DW
]]>Okay, fine. Maybe I can't teach your dog quantum physics, but this book taught me something about quantum physics, so that's something, right?
I finished How to Teach Physics to Your Dog by Chad Orzel as I continue my quest to understand quantum computing. The book was recommended by Mark Russinovich at the end of his Microsoft Ignite 2019 talk, Quantum computing: Computing with a probabilistic universe. You can watch it at the Microsoft Ignite site: LINK.
In short, it's pretty good.
It's a solid way to understand some of the core concepts of quantum mechanics that make quantum computing possible. It goes deeper than just defining concepts like superposition, entanglement, and QED; it also gets into a bit of the math and the history behind them.
Now, I'm no book scientist and as such, I'm not going to worry about trying to put together a fancy book review that rates my experience. Instead, I thought it would be a good idea to highlight the parts of the book that made it a good read, and you can figure out the rest on your own.
Orzel jumps between explaining a concept traditionally and having a conversation with his dog (hence the title).
The style is kinda silly, which is something of a fresh take for explaining science, but it works surprisingly well.
Beyond helping to simplify some of the more complicated or mind-bending points of quantum mechanics, it also lightens the mood and makes it less of a "physics book" and more of a "story about physics".
I think the best part about the book is how it builds on each topic, chapter by chapter.
It reads like a story, where each chapter prepares you for the next until you finally hit the big wrap-up on entanglement and QED (quantum electrodynamics), along with a chapter on what to look for when people try to abuse quantum physics to push a non-scientific agenda or make a quick buck.
Physics is math. That's just how it is, and this book does a great job of introducing the math without dwelling on or depending on it. I appreciated that, especially near the end of the book where ideas like QED start to really bend your mind.
I appreciated how the book didn't shy away from the math and used the conversations with the dog to bring it down to a "not too mathy" level. It keeps the reader in check and reminds them that this isn't just philosophy, but real science.
This book set me up with a solid foundation on the physics that makes quantum computing possible, which was the whole reason for reading it. Its "learning through story and conversation" approach, which doesn't shy away from the math (but doesn't dwell on it), makes it an easier read than one might think.
Recommended for those interested in understanding the science (not magic) of quantum physics that makes quantum computing a real-life thing.
]]>That's not to say the content shouldn't provide value to the reader, also known as you, but it's still a very important question to answer.
With the reboot of my blog and getting back into social media, this has become the first question I ask myself before writing a post, or planning a livestream, or scripting a video, or even saving a link. It's selfish, but if the post isn't valuable to me, then why would it be valuable to you?
In my previous professional life, the content I created was exclusively for the readers and for Microsoft. I made content, and they awarded me an MVP award with some great perks and an awesome community, but it led to an unforeseen need to be validated and rewarded for my content, which I wrote about recently: LINK. It was a good deal and made sense at the time. But that was then, and this is now. Now when I look at creating content, the first person it needs to bring value to is me.
I didn't.
At least I didn't for a few years and just left my blog and website to be a development science experiment. Reason being, I didn't know the value the blog could bring me.
I'll leave out the numerous self-driven arguments and half-baked reasons I gave myself, but ultimately it didn't make a lot of sense to keep on blogging.
Until it did again.
Even though I stopped blogging and stopped doing my annual self-retrospectives (like this one from 2014: LINK, and this one from 2015: LINK), the self-reflective process never stopped. Rather, my self-retrospectives evolved into smaller chunks of thought that I would share with trusted friends or family to get opinions on the deep thoughts from this inner monologue I maintained.
Then a pandemic started and I was unable to share, at least not with the frequency and ease that I used to.
It's the inability to share the way I was used to that got me thinking differently. I started to realize that sharing was my way to get approval and a pat on the head for an idea. I didn't need to take action with my website or my social media presence, because I already had a bunch of people telling me it was a good idea. Why bother doing it when I already got it validated?
And so I tested my theory and stopped sharing my ideas on social media. Not long after that, I stopped sharing my ideas with my trusted friends virtually (unless I had something to show, which I never did) not because I didn't want to, but because I needed to learn to do this for myself.
For me.
This means I am the first validator of the idea, which ultimately gives me that first bit of validation to approve my time investment in it. Of course, the catch is that if I want further validation, I suppose I should validate my own feelings first to make sure I'm right.
Time is my most important asset. For that reason, the first question I ask is: "What value does this bring me?"
Because if it brings me value, then it will probably bring value to someone else out there on the internet.
Thanks for playing.
~ DW
Photo by Rob Schreckhise on Unsplash
Photo by Drahomír Posteby-Mach on Unsplash
]]>I'm a millennial. Which means, I've been doing social media since before it was called "social media". I got onto Twitter very early on, I needed a university email address to create my Facebook account, and YouTube...well, it wasn't owned by Google.
The point is that social media has been a part of my entire adult life. I actually can't remember a time in my professional life when someone wasn't telling me that my social media presence or "brand" had the ability to propel my career forward, if I played my tweets just right.
It is probably because every job I had in tech was surrounded by marketing people, but I developed this bizarre obsession over "my brand", or the persona I portrayed to the online world. For years I went through exercises on building followers and reading the analytics, all on a mission to appear professional.
It wasn't a fruitless venture. I'm pretty sure my blog and Twitter account secured my many MVP awards. What I wasn't expecting was the dependency this obsession created: a weird addiction to "likes".
You see, the MVP program was how I validated doing all the extra work of keeping up with technology. I love tech, I really, really do, but had someone not pointed me in the direction of building a brand around JavaScript and/or front-end web dev, I probably never would have gone down that road. The more I built up that persona, the more likes and engagement I got, and the more likely the MVP Program was to notice me. Eventually they did notice me and TA-DA, I became a Microsoft MVP.
The MVP Award was where I think this all started. It was a reward for being so...professional or knowledgeable or hard-working in my B-time or whatever. I loved it and somehow rationalized that people get directly rewarded for their side efforts. I suppose that is sort of true, depending on how you define the term "reward", but in general I don't think it's as big of a perk as the MVP Award and all the benefits that come with it.
As the years went by, I started to expect that sort of reward for my effort, equated it with validation, and started to need it in order to feel like I was succeeding as a technology professional.
Then, I decided to let go of the MVP Award and chase my dreams.
This is the part in the story where I started to chase my game development hopes and dreams. It started out well, but not long after doing some game development streams on Mixer (yeah, remember Mixer?) and some blog posts, I started to feel uneasy about my ability and my "success" as a technology professional.
Building an expertise takes time and effort. It takes even more time and effort when your day job doesn't care about or need that expertise, and you have a new family to take care of. I kept getting caught up on how long it would take for me to "become a professional" or whatever. I kept checking my different social media analytics and started focusing a lot of time and energy on making game development content rather than actual games.
I researched marketing techniques, read social media management guides, and started learning how to promote my "dream game" before I had even really done anything other than a couple of game jams. I checked the "likes" multiple times a day and tried to figure out how to maximize the reach of my content, continuing to get in the way of building an actual commercial video game while searching (somewhat desperately) for acknowledgement through likes, thumbs up, post engagement, and views.
The habit kept coming and going, but it would always block my progress on whatever project I was working on. My game jams became about the content I produced, not the game itself. After a jam, I would share and talk about "the next steps" and all the planning I was doing instead of actually doing something with the project. No matter how much time I spent, there wasn't enough to both "share to the community" and build a game.
It was an old habit that needed to go away, and so earlier this year I just stopped sharing on social media.
That break was supposed to last about a month. That was about three months ago.
During my three-month break, I looked inward and thought about what I've done with social media over...well, most of my professional life. I've decided that it's time to start figuring out how, or if, I should return to the social networks, but I'm taking it slow and flipping the script on my social media shares.
Rather than measuring my success with likes and views, I'm looking at the social platforms as ways for me to grow personally and professionally. I'm asking myself two questions:
Does v7 of my website coincide with this? It sure does.
I'll elaborate further another day, but just writing this post helps me reflect on my own story. It feels honest and healthy to write all this down, and I'm creating content for both me and the readers. I share not only because I crave validation, but also because I think sharing my experience might help others learn something.
This site and blog is the start. It has a purpose for both my personal and professional growth, and so it is alive again.
I don't think so. I have analytics enabled on the site, but I have a purpose for this: to learn. More specifically, I want to learn what analytics can teach me about my audience. It's not just about the views and the likes, but what the readers (and players) are telling me through their engagement.
Plus-- whether I like it or not, analytics plays a critical role in decision making these days. I see it in my day job, and I see it in game development. Either way, having some literacy around the different kinds of analytics out there can't hurt me.
The trick is not wrapping success around the metrics.
I think it's going to be YouTube and other video content like livestreams. I really enjoy making my little movies, and with the pandemic in full swing, my ability to practice my presentation skills at conferences with an audience has been hampered. Between platforms like YouTube, LinkedIn, Twitch, and even Discord, there are some good opportunities to sharpen my video presentation skills.
If I'm being honest, I can't see Twitter or Facebook making a comeback in my day-to-day life. Possibly a place to echo posts or share activity, but I'm just not feeling the "hot takes" nature that comes with Twitter and Facebook. As for Instagram...I'm still undecided. I don't have a lot of pictures to share behind the scenes, but again-- never say never.
Regardless of where I share content, the website will be the hub and the question will be the same: What does sharing do to help me grow personally and/or professionally?
Thanks for playing. ~ DW
Photo Credit
]]>I've been thinking a lot about that lately, and the reason I stopped was that I didn't have a reason to do it anymore.
I started blogging because I was told that it was one of the things I should be doing, and had to keep doing, to get (and stay) in the Microsoft MVP program as a Front End Web Development MVP (formerly known as the Internet Explorer MVP program). Once I decided to leave the front-end web dev stuff behind to make way for my passion for video games and such, why bother blogging?
And so that was that. I stopped blogging and made way for all the video game development effort I could muster! Sure, there was the occasional link to a YouTube video or a little tech problem I had solved on the side, but for the most part, the blog was dead.
Zero commercial games and a few false restarts later, I'm back on the blog and the website...again.
Well, I stopped because I gave up my reason to do so. I am starting because I found a reason: I want to share and learn.
Let's break that down a little:
That is probably the most important part of the whole reason: I want to make content. I'm feeling the need to do it. Maybe it's habit or nostalgia for the days when creating content was part of keeping my professional status; either way, it's something I like to do, and the blog is a lightweight and easy way to do it.
Thinking back to when I blogged regularly as part of my MVP contribution, I didn't realize how much of a platform I had to share stories, advice, how-tos, and whatever else. It really was quite the reach, and quite a privilege. My experience and work have changed a lot, but there is still a lot to share with others.
The big difference this time around is that I'm not worried about whether or not it fits "my career goals" or my "professional focus". It's really about me putting my thoughts together in a cohesive way that might help someone reflect and make a decision. In getting v7 of the website ready, I was that "someone", and looking over all my old posts confirmed that this is a good idea.
But, even if you (or I) don't find my posts helpful in the long or short term, that's okay. It feels good to get a post out there. There's a gratifying feeling that comes with putting a post together, and that makes it worth it too.
As cliché as it sounds, I'm a lifelong learner. Content creation and management is something that keeps coming up in my side projects, and yet I've never taken the time to properly learn and understand how to do it. It's not about the marketing side, but rather what it means to contribute, learn, and engage with your audience as a solo content creator.
Plus, there are these weird "little problems" I've always had in my years of creating content with the MVP program. I'm hoping that with fresh eyes and new experiences, I can tackle these problems with a different perspective than I had in the past.
Ultimately, there is a lot to unpack here-- but assuming I keep this going, I'll continue to share what I learn, which will lead to more sharing and then more learning and...you see how this is a good thing? :)
Scott Hanselman shared this idea (reference) of valuing your effort in helping through your keystrokes (I'm greatly paraphrasing the idea). I spend time helping others one-on-one, but it's usually the same stories, ideas, thought patterns, and so on that people find helpful. This website and blog is my chance to share new things and old with a fresh perspective, one that is owned and driven by my values and ideas and not those of my employer or community.
Maybe it's selfish to think this is a good idea, but that's fine, because it's my website, my blog, and my idea that I think is good.
And if I think it's good, then that's a start.
Thanks for playing. ~ DW
]]>Travis CI announced a new pricing model that could have an impact on open source projects using Travis for continuous integration and/or deployment. For static websites, like the Western Devs website or a personal website, this could result in some unforeseen costs. With that in mind, we decided to take the plunge and migrate away from Travis and over to GitHub Actions, which provides CI and CD workflows for free to open source projects.
Fine. Here it is. It is open source, after all.
But just to be clear, this isn't a tutorial on how to code this up; rather, it's a walkthrough of what it took to get our Hexo-based static site from Travis to GitHub Actions.
And I mean workflow and not just the build.
For the Western Devs, our workflow goes like this:
GitHub workflow provides everything we need to do this, and I'll walk you through the code, which you can see for yourself here in our GitHub repo.
This is our trigger to start the workflow. That is represented by the on
section of the YAML. In our case, we want to trigger the workflow every time there is a pull request created for the master branch, a push to the master branch (i.e. a merge), or a push to any other feature (ft) or hotfix (hf) branches.
Now we have a workflow that will trigger when we want to. Next, we need to actually build the website.
Our build is exceptionally simple-- just generate the site, and if the generation is successful, the build was successful. To do this, we create a build
job that handles the work.
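Sketched out, the job looks something like this (the action versions and npm script names are illustrative; check the repo for the real thing):

```yaml
jobs:
  build:
    runs-on: ubuntu-latest
    steps:
      - uses: actions/checkout@v2
      - uses: actions/setup-node@v2
        with:
          node-version: '12'
      # The same scripts defined in package.json that developers
      # use to build the site locally
      - run: npm install
      - run: npm run build
```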
The first two steps use GitHub Actions provided by GitHub themselves. They pull our source code and then set up the Node environment we need to build the website. Once that is done, we use run
steps to execute shell commands that install our project-specific dependencies and run the build script itself.
The scripts have been defined in our project package.json
file and are used by the developers to build the site locally as well.
If we are talking about the master branch, we want to do a deployment when the build is successful. For this step, we added a conditional expression using the github
context that is provided to all actions. You can learn more about context and expressions for GitHub Actions in the GitHub Docs here.
You might also see that we're using an encrypted secret via the secrets.GITHUB_TOKEN
expression. All repositories have this feature in the settings section of the repo, and you can learn more about creating encrypted secrets for a repository here in the GitHub docs.
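Sketched out, the step looks something like this (the publishing action and output directory are illustrative stand-ins; the branch condition and secrets.GITHUB_TOKEN usage come straight from the description above):

```yaml
      - name: Deploy to GitHub Pages
        if: github.ref == 'refs/heads/master'
        uses: peaceiris/actions-gh-pages@v3  # a popular community action; ours may differ
        with:
          github_token: ${{ secrets.GITHUB_TOKEN }}
          publish_dir: ./public
```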
In our case, our deployment target is GitHub Pages, which provides free hosting and SSL certificates for open source static sites like ours.
We decided to take this opportunity to consilate everything under the GitHub umbrella because it saved us a couple of bucks, and now everything we need to manage the site is in one spot rather than spread across multiple cloud services.
Originally, we had forgotten the notification step and started to feel it right away. So an issue was created, and I put a solution in place in about 15 minutes, thanks to someone else doing all the heavy lifting and publishing their work to the GitHub Actions Marketplace.
Slack supports incoming webhooks, even for free workspaces. I set that up by following the Slack documentation, created another secret in our repository, and voila, we were back in business with the notifications.
```yaml
# Post the build result to Slack via an incoming webhook. The actual
# step uses a Marketplace action; this curl call is an equivalent
# sketch, and the secret name is an assumption.
- name: Notify Slack
  if: always()
  run: |
    curl -X POST -H 'Content-type: application/json' \
      --data '{"text":"Western Devs build: ${{ job.status }}"}' \
      ${{ secrets.SLACK_WEBHOOK_URL }}
```
The combination of GitHub Actions and GitHub Pages provides every developer with the opportunity to get a taste of DevOps while actually producing something they can show off to their peers and community. Travis CI is, and will continue to be, a great CI/CD solution for developers...but if you're looking for a one-stop shop for source control, workflow, and hosting, you can't really go wrong with GitHub.
Sure you do! Head over to davidwesst.itch.io/out-the-door to give it a whirl in your browser (no install needed) or on Windows! It's totally free, and feedback is always appreciated.
What was that? You wanted to know where to find and play my games?! Well then, if you're inclined to try some of my games (and hopefully leave some feedback), here they are:
In this video, DW walks through the new features rolled out both in-game and behind the scenes for his LD46 game jam title.
You can play the game here on Itch.io.
This release is an important one for me.
First off, it's the first "beta" release, which I've categorized as a moderately stable release that includes a "complete gameplay loop" on purpose. There are still plenty of bugs (as the video even showed), but it works and is playable.
Second, this release is the original vision of what I pictured the gamejam submission to be when D'Arcy and I came up with the idea back in April. Many months later, I have that release, which says a lot about my prototyping and experimenting process (i.e. I'm too slow).
Lastly, I have multiple versions of the game out there, including Linux/X11 and Windows versions. There's still a lot more to learn and do with the whole devops setup for my projects, but this is a great step forward and can be reused with all my Godot-based projects moving forward.
Until the next one-- thanks for playing.
~ DW
GitHub is a social development platform that will make your game development journey easier even if you're not a coder! We'll cover how GitHub can help keep your file history, how it can help organize your work, and even where to find cool projects to learn from (and possibly contribute to).
https://help.github.com/en/desktop
Tiled | https://github.com/bjorn/tiled
Inky | https://github.com/inkle/inky
OpenTTD | https://github.com/OpenTTD/OpenTTD
MicropolisJS | https://github.com/graememcc/micropolisJS