60

Installing something in Windows takes a click of a button. But every time I try to install something in Linux that isn't found in APT, I get so confused.

You download a zipped folder, and then what? If you are lucky, there is a README referring to some documentation that might help you.

What is the magic trick for "installing" extensions and applications that aren't found in APT?

I love Linux, but this problem haunts me every day.

  • See http://en.wikipedia.org/wiki/Filesystem_Hierarchy_Standard – Feb 23 '11 at 11:57
  • Off-topic, but the simple trick is to never install anything outside of the package management system. – Šimon Tóth Feb 23 '11 at 11:58
  • If it's not from APT then it probably isn't something you can just install - you'll need to compile and install it yourself. And then it'll start getting confused with APT-based packages. It's simplest to just find APT / .debs for whatever you need as far as you can. – Rup Feb 23 '11 at 11:59
  • @Rup Plus, if a piece of software doesn't have a deb package you probably shouldn't install it, since it's either deprecated, bleeding edge, or incompatible with apt-based distributions. – Šimon Tóth Feb 23 '11 at 12:05
  • Once you get enough experience with building/installing software, you can just create your own packages. Please be sure to provide feedback to the upstream provider! – jsbillings Feb 23 '11 at 14:50
  • Right, usually software comes either as part of a pre-built package (RPM, deb, etc) or it includes a Makefile that allows you to run make install to install it in the appropriate place on your Linux system. – Justin Ethier Feb 23 '11 at 14:59
  • Not all Linux distros were made equally, and the same goes for package managers. I can say that pacman keeps me pretty happy on Arch. If you use Windows and want to get more familiar with the Linux style, you can get http://msys2.github.io/ and you will have a very nice pacman port that is under active development. – cchamberlain Jul 22 '15 at 03:28

4 Answers

34

If it is software that obeys the Filesystem Hierarchy Standard, then you should place it in /usr/local and the appropriate subdirectories (like bin, lib, share, ...).

Other software should be placed in its own directory under /opt. Then either set your PATH variable to include the bin directory (or whichever directory holds the executables), or create symbolic links in /usr/local/bin.
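For example, a minimal sketch of the /opt approach (the application name, version and paths are made-up placeholders):

# unpack the application into its own directory under /opt
sudo mkdir -p /opt/someapp
sudo tar -xzf someapp-1.0.tar.gz -C /opt/someapp --strip-components=1

# either extend PATH for your shell ...
echo 'export PATH="$PATH:/opt/someapp/bin"' >> ~/.profile

# ... or symlink the executable into /usr/local/bin
sudo ln -s /opt/someapp/bin/someapp /usr/local/bin/someapp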

27

There is no simple answer to this question, but I can give you a general outline of how it works:

Most Linux software is provided by the authors (the "upstream") in source code form. This allows everyone who has a compiler for their particular platform and system configuration to download the source code and compile it themselves. Unfortunately for you, many programs rely on functions provided by other programs and software libraries (dependencies).

Windows software usually comes in precompiled form. That means there's one generic executable file for all Windows computers, and the dependencies often come with it in the install package.

Linux distributions take the source code, precompile it for you, and offer it as a package, too. The package doesn't include the dependencies, but it refers to them and makes the package system install them as well (which can sometimes lead to the mess-ups you've probably experienced yourself already).
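For example, on a Debian/Ubuntu system (vlc is just an example package name):

# APT resolves and installs the dependencies automatically
sudo apt-get install vlc

# you can inspect what a package depends on beforehand
apt-cache depends vlc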

If there is no precompiled package, you can always download the source code and compile it yourself. Most of the time, the following will work:

./configure
make
(sudo) make install (or sudo checkinstall)

The ./configure line sets the stage for the compilation process (and spits out errors if dependencies aren't met). The make line will execute the Makefile, a script that compiles all parts of the program.

Traditionally, you would use make install to then install the software. This usually puts the executables in /usr/local/bin.
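Most autoconf-generated configure scripts also accept a --prefix option if you want to control where make install puts things; /usr/local is the usual default:

./configure --prefix=/usr/local
make
sudo make install
# executables land in /usr/local/bin, libraries in /usr/local/lib, and so on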

Since you're using apt, I very much recommend getting checkinstall. You can use it in place of make install, and it will generate a .deb package for you. This makes it much easier to cleanly remove the software later on.
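Removal then goes through the package manager like for any other package ('someapp' stands in for whatever package name checkinstall generated):

sudo dpkg -r someapp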

Note that there are a handful of other build systems, for example cmake. Some software comes precompiled but unpackaged (in which case you can start it right from the unzipped folder), and some software comes as a collection of scripts you have to run yourself. Fresh code from SVN sometimes comes without a configure script, so you have to run the autoconf toolchain first ... etc., etc. ... you see, there are lots of exceptions to the rule, but with a little experience you'll be able to tell what to do with most of those mysterious downloads. Configure-make-checkinstall is a good first start.
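For reference, rough sketches of two of those variants, assuming the project actually ships the corresponding build files:

# cmake-based project: out-of-source build
mkdir build && cd build
cmake ..
make
sudo make install

# autoconf sources fresh from version control (no configure script yet)
autoreconf -i   # generates ./configure from configure.ac
./configure
make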

PS. Spend a weekend or two to learn how to program yourself, and things will become very obvious :-)

PPS. You may wonder why Linux software authors don't just provide precompiled packages instead of the source code. Well, they sometimes do. But different platforms and Linux distributions all have their own package formats and file system rules, so as a developer you'd have to provide packages for every possible configuration -- which is a pain. Ubuntu packages are often the easiest to find, though -- you should find out what a PPA is and how it works!
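Using a PPA on Ubuntu looks roughly like this (the PPA name is a made-up placeholder):

sudo add-apt-repository ppa:someuser/someapp
sudo apt-get update
sudo apt-get install someapp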

sk29910
  • "PS. Spend a weekend or two to learn how to program yourself, and things will become very obvious :-)". Bad advice in an otherwise excellent answer. Asking people to program to understand Linux is like asking airline passengers to repair planes. – apoorv020 Feb 24 '11 at 07:23
  • Instead of programming, just change it to compiling your own Linux, like LFS (Linux From Scratch): http://www.linuxfromscratch.org/ – jsolarski Feb 24 '11 at 09:17
  • @apoorv, point taken. :) – sk29910 Feb 24 '11 at 16:34
4

You should check out checkinstall. Instead of

./configure
make
sudo make install

you do

./configure
make
sudo checkinstall

and you'll be able to manage that package as if you had installed it through apt.
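checkinstall also leaves the generated .deb behind in the build directory, so you can keep it around or install it on another machine (file and package names here are illustrative):

# install the generated package elsewhere
sudo dpkg -i someapp_1.0-1_amd64.deb

# remove it cleanly whenever you like
sudo apt-get remove someapp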

Michael Homer
mgalgs
1

There is a valid, sensible reason this is so confusing (and there is also an annoying historical-artifact reason)...

Unix has a history of being multi-user, and most users did not have access to install apps outside of the areas they had been granted specific access to.

So the theory would be that you would build something in your home directory, then copy it to an area you had control over (your own project area, or a shared area).

Windows PCs are generally single-user systems and don't have this constraint: everything goes in Program Files no matter what.

Then there is the stupid, annoying fact that every time a new version of Unix came out, the creators felt it necessary to change locations, but the old ones still had to be there for automated scripts. This leaves you with a bunch of linked directories serving the same purpose.
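You can see this on many modern distributions, where the historical top-level directories are just symbolic links into /usr (the merged-/usr layout; exact output varies by distro):

ls -ld /bin /sbin /lib
# lrwxrwxrwx 1 root root 7 ... /bin -> usr/bin
# lrwxrwxrwx 1 root root 8 ... /sbin -> usr/sbin
# lrwxrwxrwx 1 root root 7 ... /lib -> usr/lib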

The init system is even worse.

Bill K