Wednesday, 28 January 2026

A lightweight way to solve heavy data problems

Most people want more from their data, but knowing how best to use it can be difficult. There’s always a question mark over whether the development effort will deliver a big enough payback. Smart leaders know that everything has a cost – in management attention and budget.

But doing nothing is not the answer. Data-driven organisations are outperforming those less skilled in data analysis. So, what’s the answer?

At Anatec AI we are big fans of the proof of concept, or prototype. The idea is to do a relatively small amount of work that does a lot of heavy lifting. It might focus on analysing a limited amount of data to assess its quality. It might test a theory about how two data sources could be combined to identify patterns. Or it might tackle a difficult interface that absolutely has to work for the solution to succeed. Each project is different, because each problem is different. The approach, however, is consistent: do a small amount of high-risk work that tells you whether or not you are on track.

The cloud provides a great way of running lightweight, small projects – you don’t need to buy servers or install operating systems. You don’t need to commit up-front. You can use the cloud’s built-in flexibility to do just as much as you need, then decide what to do next. This is a lot safer, and easier to justify, than ambitious projects that can end in mediocre results.

Azure containers are a great fit for that way of working. They are reliable, whether running locally or in the cloud. And they can securely access data held in databases, cloud storage, queues or APIs. Of course, containers are just one example of our lightweight, small-project ethos, but they show how smart Azure technology can produce big results at a small cost.

The Anatec AI design approach underpins everything we do when architecting modern cloud systems. Our solutions need to be reliable, portable, and consistent across environments – and developed one bite-sized step at a time. Because our clients need to be in charge of the direction of the project and the budget, not the technology.

Do you want to get more from your data without breaking the bank? As qualified Microsoft developers, we’ve got a whole toolbox full of Azure tools to engineer data-driven solutions. If you’ve got an idea for your data that might be a game-changer, get in touch to find out more about how we work.

Keywords: Azure, containers, proof-of-concept, cloud native, data analytics, Azure AI

Tuesday, 20 January 2026

Three steps to optimising learning at work

Whilst everyone agrees that learning at work matters, not all organisations are good at knowing whether their learning programmes actually work. That’s because optimising learning is messy. It needs data from multiple systems, departments, and stakeholders.

Historically, L&D hasn’t rushed toward that complexity. But that’s changing.

Three things have shifted the landscape:

  1. Learning management systems (LMS) are now widespread, and they hold large amounts of learning data.
  2. Cloud platforms enable data to be combined and analysed without the need for huge data warehouses. That makes learning analytics projects much more cost effective.
  3. AI tools are available off the shelf and viable for small, focused experiments.

The result is that learning analytics is no longer theoretical or reserved for large enterprises. It’s affordable, practical, and increasingly hard to ignore.

So where do you start?

What is learning analytics?

It’s always useful to start with a definition, and learning analytics is pretty straightforward:

Learning analytics improves learning outcomes by analysing data.

It may not be complicated, but it’s still useful. It also gives you the correct order to do things – an order many organisations still get wrong.

Step 1: Define what “better” looks like

You don’t start with data. You start with outcomes. What are you actually trying to improve?

  • Fewer errors?
  • Higher sales?
  • Faster onboarding?
  • New skills that are being used in day-to-day work?

There is no universal answer. It will depend on the business context and the problems you’re trying to solve. But until you can clearly describe what success looks like, analytics will only produce dashboards, not insight.

Step 2: Decide how you’ll measure improvement

If you can’t agree on a measure, you can’t tell whether learning made any difference.

Choose metrics that reflect the outcome you care about, and that you can actually measure. That might include:

  • Knowledge gained
  • Reduced error rate
  • Skills demonstrated
  • Business performance indicators

Your metrics don’t need to be perfect; they just need to be useful. The goal is better decision-making, based on data.

Step 3: Identify the data that supports those metrics

Most organisations already collect far more data than they use: LMS activity, course attendance, assessment results, feedback surveys, HR data, performance metrics. The problem isn’t that the data doesn’t exist; it’s that it’s siloed.

When data from multiple systems is brought together, patterns start to emerge. Training volume alongside retention. Learning pathways alongside performance. Timing of learning alongside error rates.

Putting it into practice

This approach works best when treated as iterative. Small proof-of-concept projects using modern business intelligence tools and AI can deliver fast feedback on existing learning initiatives.

You don’t need a multi-year transformation. You need a focused question, a clear metric, and enough data to test whether your learning is actually helping.

If you are looking at getting more from your learning data, and improving learning outcomes, get in touch for a no-obligation chat. 

Keywords: learning analytics, data analytics, learning management system, LMS, Power BI, L&D dashboard, Azure AI

Wednesday, 6 August 2025

Unlocking the ROI of Learning

Or how to free your data from the LMS

Relevant to: HR, L&D, Operations, Finance

You’re under pressure to show the impact of learning. You need to prove that training links to performance, that learning programmes improve retention, and that investment in learning is moving the profitability needle.

But when you go looking for answers, all roads lead to your learning management system (LMS) — and stop there.

You know the data is in there somewhere. Completion rates, assessment scores, time spent learning, department-level engagement — all the raw material for powerful insights. But what you get instead are clunky exports, static reports, and dashboards that don’t speak the language of the business.

If this sounds familiar, you're not alone.

You’ve got the data!

You know you’ve got the data, that’s why you have an LMS. The bad news is that it’s not always easy to get at – particularly if your LMS is software as a service (SaaS).

If you have found your valuable learning data locked in your LMS, I feel your pain. This wasn’t what you were expecting.

Using your data



There are different ways to use data stored in your LMS. For many, the standard reports are more than enough. But for those that want to analyse learning data together with other data, or do more in-depth analysis, you have to get data out of your LMS and into a reporting data store. Here’s how that might work:

  1. API extraction – most modern LMS platforms provide RESTful APIs. With proper authentication (usually via OAuth 2.0 or an API key) you can programmatically extract learning records.
  2. Data pipeline & transformation – data is ingested into a reporting database such as Azure SQL Database, where it’s cleaned, normalised, and enriched with metadata.
  3. Semantic modelling – using tools like Power BI, you can then build a semantic layer that defines business terms — e.g., "active learner", "average time to completion", "learning impact score".
  4. Dynamic dashboards – these models power interactive visuals, filterable by time, team, location, or training programme — and update in real time or on a schedule.
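Steps 1 and 2 can be sketched in a few lines of Python. The endpoint path, field names and token handling below are illustrative assumptions, not any specific LMS’s API; a real extraction would follow whatever schema your LMS documents.

```python
# Sketch of steps 1-2: pull completion records from a hypothetical LMS REST
# API and normalise them before loading into a reporting database. The
# endpoint and field names are illustrative, not a real LMS's API.
import json
import urllib.request

def fetch_records(base_url: str, token: str) -> list:
    """Step 1: extract raw records via the LMS API using a bearer token."""
    req = urllib.request.Request(
        f"{base_url}/api/v1/completions",
        headers={"Authorization": f"Bearer {token}"},
    )
    with urllib.request.urlopen(req, timeout=30) as resp:
        return json.load(resp)["records"]

def normalise(records: list) -> list:
    """Step 2: clean and normalise raw records ready for loading."""
    rows = []
    for r in records:
        if not r.get("learner_id"):           # drop unattributable rows
            continue
        rows.append({
            "learner_id": str(r["learner_id"]).strip().lower(),
            "course": r.get("course_name", "unknown").strip(),
            "completed": bool(r.get("completed", False)),
            "score": float(r["score"]) if r.get("score") is not None else None,
        })
    return rows
```

In production this would typically run on a schedule (for example, from an Azure Function or a container), appending each batch to the reporting database that the semantic model sits on.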

If you are wondering why you need to store, check and clean data in yet another data store, I talk about that here: Can you trust your data? The bottom line is that your data needs to be clean, up to date, and readily available for analytics purposes. 

Power BI is L&D’s new best friend

With this approach, you’re no longer limited by your LMS’s front end. You get full control of your learning data — and the power to connect it to performance or finance data for deeper insights.

So, although you may sometimes feel that your data is locked in your LMS, there are ways to get at it and analyse it in friendly tools like Power BI.

Here at Anatec AI we’ve been working with data, interfaces, APIs, and learning systems for many years. So, we are well placed to help if you need it. And we can help with Power BI dashboards, scorecards, DAX queries and semantic models. There’s nothing we’d like more. 

If you have a question or want to chat about any challenges you’re facing, get in touch.

Keywords: learning analytics, data analytics, learning management system, LMS, Power BI, L&D dashboard, L&D scorecard

Tuesday, 5 August 2025

Can you trust your data?

If you’re relying on data to make decisions, here’s a question for you:

Can you actually trust your data?

Bad data leads to bad decisions. And if you’ve ever tried to build a Power BI dashboard using unprepared data… you know what I'm talking about. That’s because good analytics starts before the dashboard—with trustworthy data.

So, let’s talk about what makes data trustworthy—and how to get there.

The 3 C’s of trustworthy data

To be useful, your data needs to be:

  • Correct
  • Current
  • Constant

These three qualities are the backbone of any solid reporting system. Miss one, and your insights could be misleading at best—or flat-out wrong.

1. Correct data: accuracy isn’t optional

Data errors creep in more easily than you think. A common name mix-up might credit someone with attending a course they never showed up for. Or a stock item might never be recorded because “we were going to use it straight away.”

Sound familiar?

Then you’ve got things like:

  • Null values
  • Duplicates
  • Outliers
  • Inconsistent fields between systems

All of these distort the truth your analytics are supposed to reveal. Cleaning and validating your data isn't optional—it's foundational.
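A small sketch of what that validation can look like in practice. The field names and the sigma threshold below are illustrative choices for this example, not a fixed standard.

```python
# Drop rows with missing keys, de-duplicate inconsistent entries, and flag
# statistical outliers. Field names and thresholds are illustrative.
from statistics import mean, stdev

def clean(records: list) -> list:
    """Drop rows missing the 'person' field, then de-duplicate on (person, course)."""
    rows, seen = [], set()
    for r in records:
        if not r.get("person"):
            continue
        # Normalise case and whitespace so "Ann " and "ann" count as one person.
        key = (r["person"].strip().lower(), r.get("course", "").strip().lower())
        if key in seen:                       # duplicate attendance record
            continue
        seen.add(key)
        rows.append({**r, "person": key[0], "course": key[1]})
    return rows

def flag_outliers(values: list, sigmas: float = 3.0) -> list:
    """Return values more than `sigmas` standard deviations from the mean."""
    m, s = mean(values), stdev(values)
    return [v for v in values if abs(v - m) > sigmas * s]
```

Flagged values still need a human decision: an outlier might be a data-entry error, or it might be the most interesting row in the dataset.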

Question for you:

What’s the most unexpected data error you've ever uncovered?

2. Current data: how fresh is “fresh enough”?

Everyone has a different definition of “up-to-date.”

For a factory floor, real-time data might be essential.

For HR reports, yesterday’s numbers might do just fine.

But what matters most is transparency—do you know how current your data is, and can you trust that timestamp?
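One way to make that transparency concrete is to record a load timestamp with every dataset and compare it against the staleness each audience will tolerate. A minimal sketch, with illustrative thresholds:

```python
# Make freshness explicit: compare a dataset's load timestamp against the
# maximum age each audience will accept. Thresholds are illustrative.
from datetime import datetime, timedelta, timezone

def is_fresh(loaded_at: datetime, max_age: timedelta, now: datetime = None) -> bool:
    """True if the data was loaded within `max_age` of `now`."""
    now = now or datetime.now(timezone.utc)
    return now - loaded_at <= max_age

FACTORY_MAX_AGE = timedelta(minutes=5)   # near-real-time dashboard
HR_MAX_AGE = timedelta(days=1)           # yesterday's numbers will do
```

The same ten-minute-old dataset can then be flagged stale on the factory dashboard but perfectly acceptable on the HR report.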

3. Constant data: reliable, available, and secure

Once your data is cleaned and verified, you need to keep it:

  • Securely stored
  • Regularly backed up
  • Available wherever it’s needed

You don’t want your cleaned dataset disappearing on a lost laptop or overwritten by mistake. Constancy means your data is dependable and accessible, day in and day out.

Choosing the right data platform: why Azure?

The Azure data platform gives you flexible, scalable ways to store your analytics-ready data, depending on your requirements:

Azure SQL Database

  • Great for datasets up to a few terabytes (TB)
  • Geo-redundant and cost-effective
  • Easily scalable up or down
  • Supports the Medallion architecture using schemas or databases

(If you’re curious about that approach, I wrote more about it here.)

Microsoft Fabric

  • Ideal for high-performance analytics at scale
  • Better suited for large volumes of data
  • Higher cost, but better performance
  • Also supports the Medallion architecture

Once your data is in the cloud, everything gets easier—from sharing semantic models to boosting Power BI performance.

Don’t skip the foundations

Data visualisation tools like Power BI are only as strong as the data underneath them. Trustworthy data isn't just clean—it's correct, current, and constant.

So, here's a challenge:

What’s your biggest headache when it comes to data quality or reporting?

Drop it in the comments—I’d love to hear what you're wrestling with (and maybe swap ideas on how to fix it). And as always, if you want to talk about data quality, you can get in touch here.


Anatec AI has worked with data quality issues for many years. We focus on helping companies make better use of their data to improve their performance and resilience.

Keywords: reporting, analytics, data quality, Power BI, Microsoft Fabric, dashboard design


Tuesday, 29 July 2025

Learning data is valuable data

Learning data may be the most underused strategic asset in your company. The world’s most powerful companies are increasingly evidence-led, yet learning and development (L&D) is often under-represented when it comes to data analytics. It has never been more important to nurture and retain good people. AI tools such as ChatGPT can assist skilled and knowledgeable people, but they don’t replace them.

Blended learning - messy data

The move from classroom training to online or blended learning has reduced costs and means less time away from work. But it has made learning a lot harder to track. Micro-learning results can be difficult to evaluate when you want a strategic view of your learning. E-learning, YouTube, and MOOC courses all produce more complex data than pure classroom training.

The rise of the learning management system

Learning management systems (LMS) make it possible to centralise learning data. An LMS not only enables you to distribute your own learning content but also track other learning in different formats. The only challenge is to turn this valuable asset into actionable insights. 

LMS data and Power BI 

Whilst every LMS has a variety of reports, there’s no replacement for the ability to interrogate data in analytics tools such as Microsoft Power BI. The ability to visualise the data in different ways, add multiple “slicers”, and use AI to find patterns and trends is tremendously powerful. Plus, you can match your learning data with data from other sources, such as performance or HR retention data. This flexibility can unlock real insight into what your learning programmes are achieving. Power BI can help you make sense of a more complicated world of learning, massively improving your chances of finding patterns amongst the noise.

How Anatec AI can help

If you want to do more with your LMS data and are stuck somewhere along the way – we can help. There are multiple ways of getting data from your LMS and into Power BI, depending on your needs. We can design data interfaces and help you choose the right way to store reporting data. We build semantic models that focus on your business needs and can help you make better use of Power BI AI capabilities.

Whether you’ve got a simple question, or an analytics project you need help with, get in touch to see if we can help. We’d love to hear from you. 


Anatec AI has worked in the learning and development space for many years. We designed and built two custom learning systems before LMSs were even a thing. Now we focus on helping companies make better use of their data to improve their performance and resilience.

Keywords: learning analytics, LMS data, Power BI for L&D, evaluating learning impact