Tracking Table Structure Changes
In a previous post, I wrote about exporting local table data to text files. We don't do that for every local table. If changes to the table data could impact the functionality of our program, we want that data included in version control. This goes back to my Golden Rule of Version Control:
Anything that can lead to a bug in your software belongs in version control.
Another potential source of bugs is changes to the structures of our tables. This includes both local tables and linked tables that reside in our back-end. Even changes to the structure of local temporary tables could cause problems if our code relies on that structure.
Exporting table structure
In my script, I loop through the TableDefs collection to process each table in the front-end file. I skip over system tables and hidden tables. I do this to cut back on version control "noise."
For each TableDef, I loop through the .Fields and .Indexes and export information about them to the text file. Each table gets its own text file.
When I first wrote this script, I exported all the properties for every object (TableDef, Field, and Index). It was the same approach that I use when exporting database properties. The problem this time is that it was all a lot of oysters, but no pearls.
I took a different approach and hand-picked only those properties that I found most meaningful. You may want to review the properties yourself using the techniques in the article linked above to see if you want to include different properties in your export. The ones in my script below have served me well for many years, so it's at least a good starting point.
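For reference, here is roughly what one of these exported .table files ends up looking like. The layout comes from the script in the Sample code section below; the table, field, and index values here are made up, and I'm assuming the DbTypeToString helper returns DAO type names like dbLong and dbText:

    Table Properties
    ----------------
     Description: Customer master list
     Attributes: 0
     SourceTableName: 

    Fields
    ------
    CustomerID [dbLong] (4)
     Attributes: 17
     Ordinal Position: 0
     Default Value: 
     Required: False
     AllowZeroLength: False
    CustomerName [dbText] (50)
     Attributes: 2
     Ordinal Position: 1
     Default Value: 
     Required: True
     AllowZeroLength: False

    Indexes
    -------
    PrimaryKey
     CustomerID {Attributes: 0}
     Name: PrimaryKey
     Primary: True
     Unique: True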
Exporting table locations
It's important to know where our linked tables are actually linking to. Each table definition in our program could be a local table that resides in our front-end or a linked table that resides in a separate back-end location.
The back-end does not necessarily refer to a single file or database. Each linked table can link to a different data source. When we talk about the "back-end," we are referring to one or more external data stores, regardless of what form that data storage takes. The "back-end" can actually be a collection of Access database files (.mdb or .accdb), SQL Server databases, or any other program or file that supports ODBC connectivity (text files, Excel, SQLite, MySQL, Postgres, etc.).
I should point out here that I always use DSN-less connections when linking to SQL Server and other databases. I do this for two main reasons: 1) it simplifies deployment, and 2) it works better with version control. I could say plenty more about the topic, but that is way beyond the scope of this article.
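For illustration only (the driver name, server, and database below are placeholders), a DSN-less link stores everything in the connect string itself:

    ODBC;DRIVER={ODBC Driver 17 for SQL Server};SERVER=MyServer;DATABASE=MyDatabase;Trusted_Connection=Yes;

whereas a DSN-based link stores little more than ODBC;DSN=MyDsn; and leaves the real connection details in a DSN that has to exist on every user's machine.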
The _TableList.table file
Once your schema is in place, it's relatively rare to have changes to the .table files that hold the field and index information for each table. However, the TableDef.Connect property will often change en masse when switching data environments, such as from development to production.
If we included the .Connect property in each individual table file, there would be a huge number of changed files every time we switched environments. A single change like that, especially one that happens relatively often, should not trigger a slew of changes to dozens--or even hundreds--of files. All that noise will lead us to miss meaningful changes between revisions.
To avoid this problem, I store all of the TableDef.Connect strings in a special text file named _TableList.table. There's nothing special about this name, other than it is consistent across all of my repositories. The consistency is important. Feel free to change it to whatever you like, but DO NOT change it once you start using it.
Being able to perform a multi-file search through every file named _TableList.table in your repository folder is an enormously powerful and useful feature. This feature alone is worth the price of admission the first time you need to support a database server migration. Finding every table linked to that server is dead simple. Doing the same thing manually would take way longer and almost certainly lead to missed tables, especially if you have several applications that connect to a single server.
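As a quick illustration, finding every table that links to a given server across all of your repositories can be as simple as a recursive search from their common parent folder (the server name here is hypothetical):

    findstr /s /i "OldSqlServer01" _TableList.table

Any multi-file search tool will do; the point is that the consistent file name gives you a single, reliable target to search.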
Sample code
'db: an open DAO.Database object (the front-end file)
'ExportPath: the folder (without trailing slash) where the text files will be saved
'oApp: an open Access.Application object
Public Sub ExportTables(db, ExportPath, oApp)
    Dim td, objFSO, f, prop, fld, ix, x, TableList
    'On Error Resume Next
    Set objFSO = CreateObject("Scripting.FileSystemObject")
    Set TableList = objFSO.OpenTextFile(ExportPath & "\_TableList.table", 8, True)
    For Each td In db.TableDefs
        Err.Clear
        'Exclude system tables and hidden tables
        Dim acTable
        acTable = 0
        If Not Left(td.Name, 4) = "MSys" And Not oApp.GetHiddenAttribute(acTable, td.Name) Then
            On Error Resume Next
            WScript.Echo "Tables: " & td.Name & " (" & td.Fields.Count & " fields, " & td.Indexes.Count & " indexes)"
            If Err Then
                'LogError ExportPath, "Error exporting table: " & td.Name
                Err.Clear
            End If
            On Error GoTo 0
            If UCase(td.Name) = "AUX" Then
                'Aux is a reserved filename in Windows
                Set f = objFSO.OpenTextFile(ExportPath & "\" & td.Name & "_.table", 8, True)
            Else
                Set f = objFSO.OpenTextFile(ExportPath & "\" & td.Name & ".table", 8, True)
            End If
            'Write connect strings to a separate file to minimize the number of modified text files when switching db's
            TableList.WriteLine td.Name & " Connect: " & td.Connect
            f.WriteLine "Table Properties"
            f.WriteLine "----------------"
            On Error Resume Next
            f.WriteLine " Description: " & td.Properties("Description").Value
            On Error GoTo 0
            f.WriteLine " Attributes: " & td.Attributes
            f.WriteLine " SourceTableName: " & td.SourceTableName
            f.WriteLine ""
            f.WriteLine "Fields"
            f.WriteLine "------"
            For Each fld In td.Fields
                f.WriteLine fld.Name & " [" & DbTypeToString(fld.Type) & "] (" & fld.Size & ")"
                On Error Resume Next
                f.WriteLine " Description: " & fld.Properties("Description").Value
                On Error GoTo 0
                f.WriteLine " Attributes: " & fld.Attributes
                f.WriteLine " Ordinal Position: " & fld.OrdinalPosition
                f.WriteLine " Default Value: " & fld.DefaultValue
                f.WriteLine " Required: " & fld.Required
                f.WriteLine " AllowZeroLength: " & fld.AllowZeroLength
            Next
            f.WriteLine ""
            f.WriteLine "Indexes"
            f.WriteLine "-------"
            On Error Resume Next
            For Each ix In td.Indexes
                f.WriteLine ix.Name
                For Each fld In ix.Fields
                    f.WriteLine " " & fld.Name & " {Attributes: " & fld.Attributes & "}"
                Next
                For Each prop In ix.Properties
                    If prop.Name <> "DistinctCount" Then
                        f.WriteLine " " & prop.Name & ": " & prop.Value
                    End If
                Next
                f.WriteLine ""
            Next
            On Error GoTo 0
            f.Close
            Set f = Nothing
        End If
        If Err Then
            LogError ExportPath, "Error exporting: " & td.Name
            Err.Clear
        End If
    Next
    Set objFSO = Nothing
End Sub
Windows Reserved File Names
If you read the sample code carefully, you will come across an If statement where I'm adding an underscore to the filename for an "Aux" table that exists in one of my applications. I'm doing this because Aux.* is a reserved file name in Windows.
In fact, there is a whole list of such reserved file names. I did not include the entire list in my script. I could probably come up with a clever way to check for every possibility without adding 20 clauses to my If statement, but Aux is the only conflict I've encountered with my own applications. Just be aware that you may need to adjust the code if, for example, you have a table named "CON" in one of your programs.
HowToGeek tells the interesting backstory about why Windows 10 still has this strange constraint:
Now, it’s over forty years later and we still can’t name files “con.txt” or “aux.mp3” because Windows wants to stay compatible with ancient programs that might be using this feature. It’s a good example of how intensely Microsoft is committed to backwards compatibility.
I told you Microsoft has reverence for backwards compatibility.
Image by Free-Photos from Pixabay