
Allegro.cc Forums » Programming Questions » Distributing for Linux

This thread is locked; no one can reply to it.
Distributing for Linux
Ariesnl
Member #2,902
November 2002

What is the best and most reliable way to distribute your software for Linux?
On Windows there is MSI and the various installer tools (InstallShield, Inno Setup, WiX).
Since my targeted project is open source, distribution with source files is preferred. But for those who can't compile it, I would like to provide binaries too.

Perhaps one day we will find that the human factor is more complicated than space and time (Jean-Luc Picard)
Current project: [Star Trek Project ] Join if you want ;-)

amarillion
Member #940
January 2001

Since this is an open source project, I recommend just distributing the source as a tar.gz with a Makefile included.

Then, to leverage the open source infrastructure, create packages in deb/rpm format. You can start by distributing a deb from a personal package archive (PPA), and if that works well, submit it to the main Debian repositories. That will make it very easy for Linux users to install your software.
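For the tar.gz route, the release step can be as simple as the following (the project name and version are hypothetical, not from this thread):

```shell
# Pack a versioned source directory (containing the sources and the
# Makefile) into a tarball that users can extract, build, and install.
tar czf mygame-1.0.tar.gz mygame-1.0/
```

Users would then run `tar xzf mygame-1.0.tar.gz && cd mygame-1.0 && make`.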

--
Martijn van Iersel | My Blog | Sin & Cos | Tegel tilemap editor | TINS 2017

Ariesnl
Member #2,902
November 2002

How should I write the Makefile so the program actually gets installed?
I'm quite new to writing Makefiles.

Perhaps one day we will find that the human factor is more complicated than space and time (Jean-Luc Picard)
Current project: [Star Trek Project ] Join if you want ;-)

bamccaig
Member #7,536
July 2006

In theory, a static build of everything is the easiest. This would be a monolithic binary that contains copies of all of the binary code needed to run the program. There would be no dynamic linking at run time; everything needed would be in the binary. I'm not 100% sure whether this is possible with the C runtime itself or whether that always needs to be dynamically loaded (glibc discourages fully static linking, though alternative libcs such as musl support it), but I can't imagine a technical reason why it would be impossible...

Alternatively, you could link your program against local copies of all of the libraries and ship them (including the C runtime) along with a launcher script that makes sure the correct libraries/runtimes are loaded at run time; that might work too.

I like amarillion's suggestion of providing packages too. In theory, the static build is the most reliable: it should keep working until the kernel drops support for the binary format. I'm not sure how things like video access would be handled in that case; in theory, your static binary code would invoke the GL library API, and that would call into the kernel appropriately... should be fine, maybe. If you can get a dynamic build that loads only your bundled libraries, including a bundled C runtime, that should be pretty reliable too.

The packages will probably need to be rebuilt for each distro release you support to guarantee they work. That adds a bit of trouble for you (or your build server, if you can set one up or find one). You could combine the ideas and bundle the static or "relative dynamic" builds into a package so that users can easily install those as well, but that should be optional, since downloading and running them in place should be fine...

Essentially there's no silver bullet... I'm skeptical about the "ideal" cases, so you'll likely have to fight to make those work, if they're even possible... and the packages will require you to keep them up to date...

Append:

Generally speaking, installing on Linux just means copying the build outputs into some prefix tree: typically /usr for distro-managed software, /usr/local for manually installed software, and somewhere under $HOME for personal stuff. Within the prefix you typically have bin, include, lib, and share directories, where executables, headers, libraries, and miscellaneous files (e.g., documentation) are installed, respectively. The Makefile would have a .PHONY target, conventionally named install, that does the copying. The install utility helps set permissions while copying (see man install).

A contrived example might be:

PREFIX ?= /usr/local

.PHONY: install
install:
	install -d $(DESTDIR)$(PREFIX)/bin $(DESTDIR)$(PREFIX)/lib
	install -m 755 bin/stripe_poker $(DESTDIR)$(PREFIX)/bin/
	install -m 644 lib/libsp_game_engine.so $(DESTDIR)$(PREFIX)/lib/

(Recipe lines must start with a tab character. $(DESTDIR) is empty for a normal install, but lets packaging tools stage the files into a temporary root.)

There are various degrees of complexity...

Mark Oates
Member #1,146
March 2001

How does Steam distribute games on Linux?

RPG Hacker
Member #12,492
January 2011

How does Steam distribute games on Linux?

I can give some information, because I stumbled upon this just today at work (this was for the Windows version of a game, but I assume it's fairly similar for Linux).

When configuring your game on Steam, it offers you a list of popular redistributables often needed by games, and you just tick the ones your game requires. Included are things like the Visual C++ Redistributables, DirectX redistributables, etc. You can, however, also provide a script to install custom redistributables, which covers any dependencies not on the list. Alternatively, you can just include all the DLLs your game needs in the game package, in which case they are installed alongside it.

I imagine for Linux this is very similar.

bamccaig
Member #7,536
July 2006

I'm pretty sure Steam for Linux takes the "bundle all the dynamic dependencies and configure the environment to use them instead of the system ones" approach. In particular, it needs to make sure it picks the right C runtime, because I don't think builds are compatible unless the version matches fairly closely... I can double-check when I get home, since I have it installed, but typically they take a Windows-like approach of just bundling all the dependencies alongside the game. It's a bit trickier because, unlike Windows, where the current directory is searched for binaries (including libraries) at load time, Linux only searches the configured paths unless you override them explicitly.

gillius
Member #119
April 2000

My understanding of Linux static builds is that the GNU C runtime on mainstream Linux distributions (i.e., not embedded) is very backwards compatible, so typically you link dynamically only to libc and a few other common libraries. From what I've heard, you must link dynamically to glibc (other libc implementations exist that can be statically linked).

As an example I am familiar with, Oracle Java comes as a single binary blob download that runs on all Linux distributions. Obviously Oracle Java is a complex beast using X libs, alsa, pulseaudio, fonts, etc., so they must have some good solution for this. Here is the ldd output for Java 7 64-bit:

$ ldd /jdk1.7/bin/java
        linux-vdso.so.1 =>  (0x00007fff423ff000)
        libpthread.so.0 => /lib64/libpthread.so.0 (0x00007fbfb423d000)
        libjli.so => /jdk1.7/bin/../lib/amd64/jli/libjli.so (0x00007fbfb4025000)
        libdl.so.2 => /lib64/libdl.so.2 (0x00007fbfb3e21000)
        libc.so.6 => /lib64/libc.so.6 (0x00007fbfb3a8d000)
        /lib64/ld-linux-x86-64.so.2 (0x00007fbfb4465000)

$ ldd /jdk1.7/bin/../lib/amd64/jli/libjli.so
        linux-vdso.so.1 =>  (0x00007fff51dee000)
        libdl.so.2 => /lib64/libdl.so.2 (0x00002ad011b36000)
        libc.so.6 => /lib64/libc.so.6 (0x00002ad011d3b000)
        libpthread.so.0 => /lib64/libpthread.so.0 (0x00002ad012094000)
        /lib64/ld-linux-x86-64.so.2 (0x0000003446800000)

libjli.so is a shared object included with the Java distribution itself, and as you can see, it doesn't link against anything beyond the same core set. That suggests you only need to link dynamically to libc, libpthread, libdl, and the ld-linux-x86-64 loader.

I remember from C development that when you linked to libc this way, there was some symbol mechanism that told GNU libc you required some minimal version, and the program would fail to load at runtime if the installed libc was too old. But I can run this modern Java build on a 5+ year old Linux system, so libc must be extremely stable.
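That mechanism is glibc's symbol versioning: every symbol a binary imports is tagged with the glibc version that introduced it, and the loader refuses to start the program if the installed glibc is older than the newest tag. You can inspect what a build requires like this (the binary name is hypothetical):

```shell
# List the glibc symbol versions a binary depends on; the newest version
# printed is the minimum glibc release the binary will run on.
objdump -T mygame | grep -o 'GLIBC_[0-9.]*' | sort -u -V
```

Building on an old distro (or against older headers) keeps these version tags low, which is presumably how Oracle's Java binaries stay runnable on old systems.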

If anyone can produce an Allegro .so that links only against these libraries, I would be interested in it too for my jalleg binding project, so I could provide a single JAR that runs on all Windows and Linux systems.

Gillius
Gillius's Programming -- http://gillius.org/

David Couzelis
Member #10,079
August 2008

The optimal solution for open source Linux software is to provide the source code with a Makefile. Then, the package maintainers for the different distributions, for example Debian, Arch Linux, or even the FreeBSD operating system, will use your source code and Makefile to create the package for it.

The standard installation procedure for open source software is "./configure --prefix=/usr && make && make install". Various software projects are designed to provide this method of installing software in a portable way; I choose to use GNU Autotools. You can read my tutorial on how to quickly and easily set up your project with GNU Autotools here:

https://bbs.archlinux.org/viewtopic.php?pid=1255512#p1255512
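For a very small project, the Autotools input files can be tiny. This is only a rough sketch under assumed names (mygame, main.c), not drcouzelis's actual setup:

```
# configure.ac
AC_INIT([mygame], [1.0])
AM_INIT_AUTOMAKE([foreign])
AC_PROG_CC
AC_CONFIG_FILES([Makefile])
AC_OUTPUT

# Makefile.am
bin_PROGRAMS = mygame
mygame_SOURCES = main.c
```

Running autoreconf --install then generates the configure script, after which the usual ./configure --prefix=... && make && make install sequence works, including a correct install target for free.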

And my latest Allegro 5 project also uses it, which can be seen here:

https://github.com/drcouzelis/colorwandcastle

Ariesnl
Member #2,902
November 2002

Thanks !

Perhaps one day we will find that the human factor is more complicated than space and time (Jean-Luc Picard)
Current project: [Star Trek Project ] Join if you want ;-)
