All of my programs tend to be rather rudimentary console applications in C. For example, I may write some code to parse a file header and then print some data from the header to the screen. To do this, I would just use functions/symbols from stdio.h, stdlib.h, string.h, and stdbool.h, such as printf(), fopen(), fread(), etc. I usually get away with writing my code in main.c, along with several .h files and the .c files that go with them. When it comes time to compile, I will do something like: gcc main.c file1.c file2.c -g -Wall -o my_program
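To make that concrete, here's a minimal sketch of the sort of tool I mean; the header layout and field names (magic, version, num_records) are invented purely for illustration:

    #include <stdio.h>
    #include <stdlib.h>
    #include <stdint.h>

    /* Hypothetical fixed-size file header, made up for this example. */
    struct header {
        char     magic[4];    /* file-type tag  */
        uint32_t version;     /* format version */
        uint32_t num_records; /* record count   */
    };

    int main(int argc, char *argv[])
    {
        if (argc < 2) {
            fprintf(stderr, "usage: %s <file>\n", argv[0]);
            return EXIT_FAILURE;
        }
        FILE *f = fopen(argv[1], "rb");
        if (!f) {
            perror("fopen");
            return EXIT_FAILURE;
        }
        struct header hdr;
        if (fread(&hdr, sizeof hdr, 1, f) != 1) {
            fprintf(stderr, "short read on header\n");
            fclose(f);
            return EXIT_FAILURE;
        }
        printf("magic:   %.4s\n", hdr.magic);
        printf("version: %u\n", (unsigned)hdr.version);
        printf("records: %u\n", (unsigned)hdr.num_records);
        fclose(f);
        return EXIT_SUCCESS;
    }

Everything in there comes from the standard headers, which is exactly the situation I'm asking about.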
The program runs fine, and I simply share the source code with my colleagues, or, if they're on the same OS, the binary itself. They can typically either build the code just as I did and run it, or run the binary directly. If a colleague is on a different OS, they build the source on that machine, or I build it for them on a machine I have running that OS.
I've never really had to consider how my dependencies were being linked at all, in fact. This is probably because I write mostly internal tools and am not releasing to large audiences. That being said, in which situations would the above method fail to run on a system? Is it possible that somebody who has the same version of gcc installed would not be able to run my executable, or to build the code themselves and then run it, when I'm only using standard C functionality? I've even taken the very same C code from a Linux box, pasted it into Visual Studio, and compiled it with MSVC, and it still works fine with the standard functions... so even cross-compiler, I haven't needed to think about the linking yet.
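I gather that running something like

    ldd my_program

on Linux would list the shared libraries (libc in particular) that the executable needs at run time, so some dynamic linking is clearly happening behind the scenes; I've just never had to pay attention to it. Whether that's what would bite me when moving a binary between systems is part of what I'm unsure about.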