This is a continuation of my Gnome 3
notes.
Disclaimer - editing system files can break your system; all
information is presented as-is and without warranty.
5/19/12
Summary of Configuration Changes I made to Ubuntu 12.04
The default Unity and Gnome Shell interfaces don't work well for
me, mainly because of a lack of a normal app menu and a task bar
that supports window buttons. Unity is somewhat better but its
taskbar icons represent apps, not windows, and more often than not
I have several instances of the same app running (terminal
windows, file manager windows, edit windows etc). This situation
can be corrected by installing a traditional panel such as
lxpanel, which with some configuration works with both Gnome Shell
and Unity. Other options that provide an old-style Gnome-based
interface with the nautilus desktop include Gnome Classic
(gnome-panel), Cinnamon, Mate, and creating custom Gnome sessions.
Non-Gnome interfaces that can be installed include KDE, LXDE and
XFCE. These are all just different interfaces to the same
underlying operating system (in Linux the operator GUI is really
just another app), for the most part they all run the same apps
and can be installed in parallel. The desired GUI can be selected
from the login screen.
Gnome 3 makes it fairly easy to create custom sessions by adding
files to the /usr/share/gnome-session/sessions and the
/usr/share/xsessions directories. The xsessions directory contains
.desktop files that are made available from the login screen and
specify which .session file to run from the sessions directory.
Not all components (including lxpanel) can be run directly from a
session file, so I add a script to my autostart programs that runs
additional components based on the contents of the DESKTOP_SESSION
variable, which is set to whichever .session file is running. When
playing around with custom sessions it's a good idea to (at least
at first) set up the system so that it does not automatically log
in, otherwise if something goes wrong you might end up in a
non-functional session with no easy way to change to a working
session - if that happens you can boot to recovery mode, navigate
(cd) to /usr/share/xsessions and rename (mv) the non-working
.desktop file. It also helps to create desktop launchers that run
the Gnome logout/restart tool, a terminal and a file manager (or
enable the "Home" icon) so if the panel fails to load you can
still operate the session to fix it or get out of it. Gnome 3
removed the option to create desktop icons, but the function can
still be accessed by installing gnome-panel and running the
gnome-desktop-item-edit tool; I use a nautilus script to make it
easy to run.
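To make the recovery procedure above concrete, here is roughly what it looks like, demonstrated in a scratch copy of the directory so it's safe to try (lxgnome.desktop is a hypothetical example name - in a real recovery shell you would run the mv directly in /usr/share/xsessions on whichever .desktop file is broken):

```shell
# Sketch of the recovery fix: hide a broken session entry from the login
# screen by renaming its .desktop file (the greeter only lists *.desktop).
xs=$(mktemp -d)                      # stand-in for /usr/share/xsessions
touch "$xs/lxgnome.desktop"          # the broken session entry (hypothetical)
mv "$xs/lxgnome.desktop" "$xs/lxgnome.desktop.disabled"
ls "$xs"                             # -> lxgnome.desktop.disabled
```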
This material assumes basic knowledge of how to use a command
line, navigate the file system, and create scripts. Scripts must
be marked executable before they can be run. To create a new
script, right-click in the directory you want the script in and
select Create Document then Empty File, rename the file to the
desired script filename, then right-click the script file and
select Properties, click the Permissions tab, and check Allow
executing as a program. By default, when an executable script is
double-clicked a dialog is displayed with the options Run in a
terminal, Display, Cancel or Run - to edit a script select
Display. Only scripts under your home directory can
be directly edited, scripts in system directories require root
permissions to create and edit. There are two ways to do that:
one is to navigate to the directory in a terminal and enter
gksudo gedit file_to_edit (presently there's a bug where gedit
opens a second empty but modified tab - close it without saving,
see below for a workaround); the other is to make a nautilus
script that runs nautilus in root mode, after which you can
create and edit system files normally. Be very careful when
editing files in system
directories, a typo can result in a non-booting system - it's a
good idea to know how to use recovery mode and the command line to
fix things when you break them (nano can be used to edit files
without a GUI). Note - the nautilus file manager hides the true
filename of .desktop files, to see and edit these use a terminal
and run the editor directly - if in a system directory then use
gksudo gedit filename.desktop to create or edit them.
Tinkering with the guts of the system can be daunting at first, but it becomes fairly easy after some practice. The main rule: if you don't know what a file does, don't mess with it. The mods in this section do not change existing files, so they should be safe enough even if something goes wrong, but remember there's no warranty - if you break it you get to keep all the pieces. If unsure, or you don't want to take any chances, then don't try it; stick with the pre-configured sessions (install gnome-panel to use Gnome Classic etc).
To get started, use Software Center to install Synaptic.
Software Center is nice for browsing, but it's a whole lot easier
to use Synaptic to install packages when you already know what
you want; plus, if using Unity, you probably don't want it adding
a launcher for everything installed (and for some things a
launcher won't work anyway). You'll probably have to jump through
some hoops at first to enable additional repositories. Once
Synaptic is
installed and all the repositories are enabled, installing
packages typically is just a matter of typing in the package name
or key word into the quick search box, checking the package,
clicking apply and confirming the changes. To duplicate something
like my setup install gnome-panel and lxpanel. Other almost
must-have packages include gnome-tweak-tool (which pulls in
gnome-shell), gconf-editor and dconf-tools.
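As a quick sanity check, something like the following sketch lists which of those packages are already present (dpkg -s is the Debian/Ubuntu way to query a package's install state; the check is wrapped in a function just to keep it tidy):

```shell
#!/bin/bash
# Report install state for a list of packages using dpkg -s (Debian/Ubuntu only)
check_pkgs() {
    for pkg in "$@"; do
        if command -v dpkg >/dev/null 2>&1 && dpkg -s "$pkg" >/dev/null 2>&1; then
            echo "$pkg: installed"
        else
            echo "$pkg: not found"
        fi
    done
}
check_pkgs synaptic gnome-panel lxpanel gnome-tweak-tool gconf-editor dconf-tools
```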
To make things easier I use nautilus scripts, which (once
enabled) can be accessed by right-clicking and selecting Scripts.
To enable nautilus scripts, open the file manager and navigate to
[your home dir]/.gnome2/nautilus-scripts (in these docs your home
dir is often written as ~). To access files beginning with a dot,
do View then check Show Hidden Files (you might want to go to
Preferences and make that permanent). Then add a script. I use
the following scripts...
Terminal...
#!/bin/bash
gnome-terminal &
CreateLauncher...
#!/bin/bash
gnome-desktop-item-edit --create-new "$(pwd)" &
BrowseAsRoot...
#!/bin/bash
gksudo "nautilus $(pwd)" &
Once scripts have been added to nautilus-scripts, the Scripts
menu is enabled and contains an option to open the scripts
directory.
To work around the gedit bug where it creates an empty modified
edit tab, add the following system script...
gksudo gedit /usr/local/bin/mygedit (or name it however you want
- just not gedit or any other existing command name)
#!/bin/bash
gedit "$1" < /dev/null
With this script in place you can do gksudo mygedit file_to_edit
without the extra tab being made.
Now to create a custom "LxGnome" session... Add the following
system files... (see above notes about editing system files)
/usr/share/gnome-session/sessions/lxgnome.session
[GNOME Session]
Name=GNOME (lxpanel)
RequiredComponents=gnome-settings-daemon;
RequiredProviders=windowmanager;
DefaultProvider-windowmanager=metacity
DefaultProvider-notifications=notify-osd
DesktopName=GNOME
/usr/share/xsessions/lxgnome.desktop
[Desktop Entry]
Name=GNOME (lxpanel)
Comment=Gnome with lxpanel
Exec=gnome-session --session=lxgnome
TryExec=gnome-session
Icon=
Type=Application
X-Ubuntu-Gettext-Domain=gnome-session-3.0
Add the following script anywhere convenient, I put it in my home
directory for easy editing...
mystartapps.sh
#!/bin/bash
if [ "$DESKTOP_SESSION" == "lxgnome" ];then
    lxpanel &
fi
Make sure it's executable. Go to the Startup Applications applet
and add an entry for mystartapps.sh - you'll have to specify the
full path to the script, for my system for the command box I
entered: /home/terry/mystartapps.sh
To create custom shutdown and logout icons on the desktop, use
the CreateLauncher script.
For Shutdown use the command line:
/usr/lib/indicator-session/gtk-logout-helper --shutdown
For LogOut use the command line:
/usr/lib/indicator-session/gtk-logout-helper --logout
Choose whatever icons you want for these. Instead of trying to
navigate through the hundreds of available icons you can just
create the launchers first, then launch a terminal on the desktop
(using the "Terminal" nautilus script), do ls to see the
filenames, then do gedit filename.desktop to edit the launchers
directly: change the Icon= line for the logout launcher to
system-log-out and the Icon= line for the shutdown launcher to
system-shutdown.
Now you can try booting into the newly made lxgnome session - log
out and select GNOME (lxpanel). The first run of lxpanel will be
a mess - don't worry, we'll fix that. For starters, the default
system tray and desktop pager applets will probably be botched;
right-click and remove them. Right-click the panel and select
Panel Settings, click the Advanced tab, and set...
File Manager to: nautilus
Terminal Emulator to: gnome-terminal
Logout Command to: /usr/lib/indicator-session/gtk-logout-helper --logout
Click on the Appearance tab and set Background to a solid color -
click the color selector and pick something, probably generic
gray (for coolness it can be made semi-transparent). Click on
Panel Applets and add back the system tray, which should add
proper volume control and network manager icons. To set the
clock up to show date and
normal AM/PM time, right-click it, select Clock Settings, and set
the format to the string: %a %b %e %l:%M %p
Or, to show just the time without the date, use the string: %l:%M %p
Change
the other lxpanel options to taste.. add and/or rearrange applets,
edit the launchers to what you want etc.
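The clock strings are ordinary strftime formats, so they can be previewed from a terminal with date before committing to one (%e and %l are GNU extensions that pad with a space instead of a zero):

```shell
date +"%a %b %e %l:%M %p"   # full date plus 12-hour AM/PM time
date +"%l:%M %p"            # time only
```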
For eye-candy (shadows, animations etc) install the mutter package then edit the lxgnome.session file to change metacity to mutter. Or make new session/desktop files edited appropriately to select whether to use effects or not.
If anything goes wrong and it doesn't work, log back into Unity
(or Gnome Classic) and figure it out - the above documents what I
did to get the kind of GUI I want but your experience might be
different, and there will probably be other aspects of the system
that need tweaking. I think I covered the basics but if there are
any errors or issues in the above instructions, let me know. Also
these techniques can be used with other panel apps besides
lxpanel, which does have some bugs - the virtual desktop pager
doesn't work quite right, the add-to-desktop feature half-locks-up
and requires the resulting launchers be made executable (it was
designed mainly for LXDE, not Gnome) - but for me being able to
configure it to do what I want makes up for the glitches.
LxPanel can also be used with Unity and Gnome Shell. Since the
panel settings probably need to be different, copy the
~/.config/lxpanel/default directory to a new directory or
directories under ~/.config/lxpanel and rename as needed - I name
new configs to match the DESKTOP_SESSION variable so for Unity
call the new directory "ubuntu", for Unity-2D use "ubuntu-2d", for
Gnome Shell use "gnome-shell". The lxpanel -p option is used to
load alternate configurations. Edit mystartapps.sh to something
like this...
#!/bin/bash
if [ "$DESKTOP_SESSION" == "lxgnome" ];then
    lxpanel &
fi
if [ "$DESKTOP_SESSION" == "ubuntu" ];then
    lxpanel -p ubuntu &
fi
if [ "$DESKTOP_SESSION" == "ubuntu-2d" ];then
    lxpanel -p ubuntu-2d &
fi
if [ "$DESKTOP_SESSION" == "gnome-shell" ];then
    lxpanel -p gnome-shell &
fi
Include only the sessions you want lxpanel to run in. BTW the
& symbols at the end of the lxpanel lines tell the script to
launch the panel and move on - this keeps the script from sitting
there waiting for the panel to exit, and also permits adding
multiple items if other components are desired for a particular
session.
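If the list of sessions grows, a case statement is a more compact way to express the same dispatch. This is just a sketch of the same logic, with the command selection pulled into a function (session_panel_cmd is my name for it, not anything standard):

```shell
#!/bin/bash
# Same logic as mystartapps.sh, expressed as a case dispatch.
# session_panel_cmd prints the panel command for a session name (or nothing).
session_panel_cmd() {
    case "$1" in
        lxgnome)                      echo "lxpanel" ;;
        ubuntu|ubuntu-2d|gnome-shell) echo "lxpanel -p $1" ;;
        *)                            echo "" ;;
    esac
}
cmd=$(session_panel_cmd "$DESKTOP_SESSION")
if [ -n "$cmd" ]; then
    $cmd &    # launch and move on, same as the & in the original script
fi
```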
Presently the default Ambiance theme looks a bit funny to me, so
I made my own somewhat modified Ambiance theme. To use it, copy
the tar.gz file to the ~/.themes directory (create it if .themes
doesn't exist), then
extract the archive. Use gnome-tweak-tool ("Advanced Settings") to
select AmbianceMod. This has the side effect of making any GTK
apps running as root look like junk; to avoid that, rename the
theme to Ambiance instead - it then overrides the stock Ambiance
theme and root apps get the stock theme.
This covers most of the tweaks I've made to my Ubuntu 12.04 systems (virtual and real). Other tweaks I've made include moving the window controls back to the right, disabling the overlay scroll bar, and/or disabling the global menu but that only matters when using Unity so I don't bother with that one. On my systems I had to uninstall apport to keep it from displaying useless crash warnings when nothing was wrong, that might be fixed now. There are numerous web pages explaining how to make basic Ubuntu 12.04 tweaks so I won't get into that here.
A Test Upgrade from 10.04 to 12.04
4/28/12 - Running from a USB stick is a safe way to test
compatibility on my main system but I want a real install to play
with.. so upgraded my HP Mini 110 that was running Ubuntu 10.04. I
got the current 32-bit 12.04 ISO, used Startup Disk Creator to put
it on a USB stick, booted it on my Mini, ran the install icon,
chose the option to upgrade the existing 10.04 installation
(already selected), used the same user name I was already using
and let it do its thing. I didn't have internet connected at the
time so it couldn't replace outdated installed apps that weren't
already on the ISO, so it told me I'd have to reinstall some
things - that's fine, I didn't want to be accessing the overloaded
repositories during installation and I needed to clean out old
cruft anyway. Rebooted into Unity, connected ethernet long enough
to run the restricted drivers thing which installed the broadcom
wireless driver. Tried to use Software Center to reinstall wine
but it didn't work... just put a non-functional icon on the
launcher. Opened a terminal and used sudo apt-get update then sudo
apt-get install wine - had to accept a license for mscorefonts, I
guess Software Center can't handle text-mode installation
questions (or something else bugged out). Repeated sudo apt-get
install for dosemu gnome-panel and other stuff I need or got
removed that I still want. Logged out and logged into Gnome
Classic - some minor cosmetic issues (already noted) but otherwise
works fine. Did a good job of preserving my data and custom apps,
all my wine/dos apps, /usr/local/bin stuff, associations, desktop
icons, nautilus-scripts etc all work fine; all I had to do was some
minor editing to a few of my corewar scripts to compensate for the
new version of xterm. The unclean shutdown bug I was getting in
VirtualBox and on an EXT4-formatted thumbdrive doesn't seem to be
a problem on this upgraded EXT3 system, dmesg | grep EXT shows no
errors... whatever causes that doesn't happen on this system.
Sound works and the speakers stop when the headphone jack is
inserted, video works, etc. So far it looks like a successful
upgrade.
4/29/12 - "lxgnome" on my HP Mini 110, using the mutter window
manager and the stock Ambiance theme...
...Nice. For me anyway.. lxpanel doesn't have integrated social
networking and all that stuff like the Unity taskbar, but I don't
use that stuff and I'd much rather have a panel with a normal app
menu and window list. This is using lxpanel's included network
status monitor applet (had to set the config tool to
network-manager to make it work), wireless also shows up with the
systray applet, at first that config (the default when first
running lxpanel) was buggy but after removing systray, adding
lxpanel's own volume and adding systray back it seems to work OK
now. To get effects at first I tried enabling Metacity
compositing, but that was too glitchy... icons leave trails when
moved. Compiz looks good but has a bit of momentary glitching when
restoring minimized windows. Mutter works very well with nice but
not overbearing effects - minimized windows smoothly collapse to
their proper place on the panel. To make the additional sessions I
simply copied my existing lxgnome .session and .desktop files to
-compiz and -mutter copies and edited to change the names and
window manager, added extra entries to my startup apps script
(while at it made a lxpanel config for gnome-shell). In all the
sessions I can run the Unity 2D launcher as needed to make use of
its facilities... not too crazy about Unity as a sole interface
but when paired with a more conventional panel for task
management, Unity provides some really cool features, especially
for searching for files and videos. All one really needs to do to
restore a traditional environment to Unity is to install lxpanel
(or any conventional panel), add it to the startup apps and do a
bit of configuring... the scripts and stuff are needed only if
using multiple sessions.
5/10/12 - quite a few 12.04 updates have come through in the last
week, fixing among other things the unclean shutdown glitch and
the improperly-updating compiz window title bars under VirtualBox.
Other VirtualBox graphics glitches remain but it's fine with
no-effects metacity sessions... graphics card emulation only goes
so far but that it can do as well as it does is quite amazing..
some video sites like Crackle work better from my virtual 12.04
system than from my native 10.04 system... buffers better with
fewer or no stream lockups.
On my HP Mini 110 I replaced my Unity-On Unity-Off launchers with
a single launcher/script that just turns it on and off...
#!/bin/bash
if ps -e | grep "unity-2d-shell";then
    killall unity-2d-shell
else
    unity-2d-shell &
fi
...one less icon matters with a single-bar setup.
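A hedged aside: pgrep -x does the same presence check as the ps|grep pipeline but matches the process name exactly, so a variant of the toggle could look like this (behavior otherwise identical to the script above):

```shell
#!/bin/bash
# Toggle unity-2d-shell on/off; pgrep -x matches the exact process name
if pgrep -x unity-2d-shell >/dev/null; then
    killall unity-2d-shell
else
    unity-2d-shell &
fi
```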
5/14/12 [edited 5/19/12] - I had my first real-life app for Gnome
3 / Ubuntu 12.04 - I needed to configure my HP Mini 110 so that I
could edit microcontroller source code written in Great Cow Basic
(that part is easy - gedit), recompile the code using the
GCB compiler configured to use the gpasm assembler (from the
gputils package) to produce a hex file, rig up a simple
serial-port terminal to talk to the gizmo via a generic USB serial
adapter, and upload the hex binary code to the gizmo through the
USB serial interface. This wasn't for some optional hobby project
but for a mission-critical tester used for a product manufactured
by the company I work for, I had to bring the thing home to work
on the code (on my 10.04 machine) but if other changes are needed
I need to have a programming rig I can take to the factory. Sounds
like a good test.
What I want... edit the source code, right click the source file
and select the GCB compiler, double-click an icon or script to
launch the serial terminal emulator, right click the hex file made
by the compiler and select sendhexUSB to send it to the gizmo,
watch the code being uploaded on the terminal. Which is exactly
what I ended up with but getting there showed a Gnome 3 fail in
its default configuration - an inability to associate files to
arbitrary apps.
So... copied over my existing pictools directory containing GCB
and my compiler scripts - it's a FreeBASIC app so already 32 bit,
should work fine (and does), installed gputils from the
repository, found the source for dterm - my favorite lightweight
serial terminal - and tried to compile it, failed, edited the
Makefile to remove -Werror so warnings won't stop it, compiled and
copied the resulting dterm binary to /usr/local/bin. Now to get it
to work... says /dev/ttyUSB0: permission denied. OK... did some
googling, learned I had to do ls -la /dev/ttyUSB0 and note which
group appeared after root (dialout) and add myself to that group.
Not wanting to goof around from a command line I installed
gnome-system-tools to get the traditional "Users and Groups" GUI,
added myself to the dialout group, logged out and back in, now my
script that runs dterm works and I can interact with the gizmo. So
far so good.
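The group check can also be scripted; this is a hedged sketch using id -nG (which lists the current user's group names) rather than parsing ls -la output:

```shell
#!/bin/bash
# Check whether the current user is in the dialout group (needed for /dev/ttyUSB*)
if id -nG | grep -qw dialout; then
    echo "already in dialout"
else
    echo "not in dialout - add yourself, then log out and back in"
fi
```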
Now to associate *.gcb and *.hex files to the appropriate tool
scripts. I used the Assogiate app to create new file types for
text/x-gcb and text/x-hex, adding text/plain to both and adding
the appropriate file masks and file type descriptions. The new
file types show up in Nautilus; right-click to add my
associations, and that's where I ran into problems - it can only
associate to "officially" installed apps or things already in its
association database (thank goodness when I upgraded it kept all my many
associations!). [...] More here about this association
feature/bug.
5/20/12 - OK I think I figured out this association stuff...
still don't fully understand how it all works but really the core
issue is being able to make an arbitrary binary, script or command
appear in the list of apps that can be associated, and it turns
out that part is easy. The idea is to put a custom .desktop file
in ~/.local/share/applications containing entries like this...
[Desktop Entry]
Type=Application
NoDisplay=true
Name=Name of the application
Exec=/path/to/the/executable %f
Terminal=false
The key to making it work is the Exec line has to end with %f or
%U or it won't show up in the list of apps that can be associated.
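For example, a hand-made entry for the sendhexUSB tool mentioned earlier might look like this (the Name and the path are illustrative guesses, not my actual file):

```
[Desktop Entry]
Type=Application
NoDisplay=true
Name=sendhexUSB
Exec=/home/terry/pictools/sendhexUSB %f
Terminal=true
```

Saved as something like ~/.local/share/applications/customapp_sendhexUSB.desktop, the tool then appears in the association list.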
So now I just need an easy way to create the .desktop file
without having to jump through a bunch of hoops; the solution I
came up with is this script...
#!/bin/bash
#
# "AddToApplications" by WTN 5/20/2012 modified 11/17/2013
#
# This script adds an app/command/script etc to the list of apps
# shown when associating files under Gnome 3. All it does is make
# a desktop file in ~/.local/share/applications, it doesn't actually
# associate files to apps (use Gnome for that). Delete the added
# customapp_name.desktop file to remove the app from association lists.
# If run without a parm then it prompts for a command/binary to run.
# If a parm is supplied then if executable uses that for the Exec line.
#
appdir=~/.local/share/applications
prefix="customapp_"               # new desktop files start with this
if [ "$1" != "" ];then            # if a command line parm specified..
    execparm=$(which "$1")        # see if it's a file in a path dir
    if [ "$execparm" == "" ];then # if not
        execparm=$(readlink -f "$1") # make sure the full path is specified
    fi
    if [ ! -x "$execparm" ];then  # make sure it's an executable
        zenity --title "Add To Associations" --error --text \
         "The specified file is not executable."
        exit
    fi
    if echo "$execparm" | grep -q " ";then # filename has spaces
        execparm=\""$execparm"\"  # so add quotes
    fi
else # no parm specified, prompt for the Exec command
    # no error checking, whatever is entered is added to the Exec line
    execparm=$(zenity --title "Add To Associations" --entry --text \
     "Enter the command to add to the list of associations")
fi
if [ "$execparm" == "" ];then exit;fi
nameparm=$(zenity --title "Add To Associations" --entry --text \
 "Enter a name for this associated app")
if [ "$nameparm" == "" ];then exit;fi
if zenity --title "Add To Associations" --question --text \
 "Run the app in a terminal?";then
    termparm="true"
else
    termparm="false"
fi
# now create the desktop file - the format seems to be a moving target
filename="$appdir/$prefix$nameparm.desktop"
echo > "$filename"
echo >> "$filename" "[Desktop Entry]"
echo >> "$filename" "Type=Application"
#echo >> "$filename" "NoDisplay=true" #doesn't work in newer systems
echo >> "$filename" "NoDisplay=false" #for newer systems but shows in menus
echo >> "$filename" "Name=$nameparm"
echo >> "$filename" "Exec=$execparm %f"
# I see %f %u %U and used to didn't need a parm, newer systems might be picky
# according to the desktop file spec...
# %f expands to a single filename - may open multiple instances for each file
# %F expands to a list of filenames all passed to the app
# %u expands to single filename that may be in URL format
# %U expands to a list of filenames that may be in URL format
# ...in practice all these do the same thing - %F is supposed to pass all
# selected but still opens multiple instances. If %u/%U get passed in URL
# format then it might break stuff.. so for now sticking with %f
echo >> "$filename" "Terminal=$termparm"
#chmod +x "$filename" #executable mark not required.. yet..
# when executable set Nautilus shows name as Name= entry rather than filename
# app should now appear in the association list
[updated 11/17/13 to also work with newer versions of Gnome]
To make it easy to use I saved it as "AddToApplications" in the ~/.gnome2/nautilus-scripts directory (executable of course). To add a custom script or binary to the list of apps that can be associated, I right-click the file and select Scripts|AddToApplications, give it a name that will appear in the application list (choose carefully to avoid conflicting with existing apps), then click yes or no to specify whether the app should run in a terminal. The script then saves a desktop file named "customapp_name.desktop", where name is the name that was entered; afterwards the custom app appears in the list of applications that can be associated. If command line parameters are needed, run the AddToApplications script without a selected file or parm - it then prompts for a command line first, and the filename being run by association is added after any parms. Assogiate is still needed to create custom file types based on specific extensions etc, but I'm used to that; it was the same with Gnome 2.
5/23/12 - Originally when trying to configure associations on my
Mini 110 I had installed kde-plasma-desktop to make use of its
association utility (and get another alternate desktop environment
to play around with), but it turns out that associations made
using KDE don't show up in Gnome's list of apps that can be
associated, and while at first they appear in the right-clicks of
the associated file types, they're subject to go away if the
associations are further edited. I ended up having to redo the
associations using the AddToApplications script, all good now.
Other utilities that have file association options include Thunar,
PCmanFM and Ubuntu Tweak - but at this point I prefer the
simplicity of the script approach.. I know what my script does,
the other utilities, not so much.
6/2/12 - LxPanel's battery monitor is broken in the current stock
0.5.8 version, at least on my Mini 110. So I downloaded the 0.5.9
source (which mentions that this is fixed) and prepared it using
the command ./configure --prefix=/usr so it would overwrite the
stock install. Of course I had to keep installing -dev packages
until configure was happy - not a process for the impatient, and
packages are often named somewhat differently than what the error
message asks for, so you might have to guess - but I'm used to it.
Once configure was happy I did make then sudo make install... the
battery monitor works. Sort of -
percentage works, time remaining doesn't. By using --prefix=/usr
the package system still thinks I have lxpanel 0.5.8 so if an
updated version becomes available through the normal channels it
should replace the compiled version.
I've been fooling around with my Gnome Shell session...
Thanks to user contributions Gnome Shell now works for folks like
me that prefer a traditional app menu and window list. I was using
LxPanel for the bottom panel in my Gnome Shell session, but it's
been replaced by the super-cool Panel Docklet extension - I
especially like being able to right-click minimized windows and
get a live overview of what the app instances are doing. I had to
do a bit of configuring - the All-in-one places extension needed
some editing to get it to fit vertically, and used the
gnome-shell-extension-prefs utility to set up the system-monitor
extension so it would fit the panel and not disable the stock
battery monitor icon (which properly shows time remaining when
clicked). The preferences utility can be run through the
extensions.gnome.org website or run manually; to make it easier to
access I used Alacarte (Main Menu) to make a menu entry for it.
Unity 2D is optional, triggered by clicking the yellow shortcut
which runs the on/off script I made. Sometimes Unity 2D is handy
for listing recent files, launching apps, searching, etc. Mostly
it runs fine this way but there are some "free" side effects of
running it like a stand-alone app - occasionally it goes away when
an app is closed (so far that only happens if the app was actually
launched from Unity), and it spams the .xsession-errors file every
few seconds with warning messages - a fix for that is to
delete/rename .xsession-errors and replace with a symlink to
/dev/null (ln -s /dev/null .xsession-errors), I hardly ever want
to actually see that stuff anyway.
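The symlink trick, demonstrated in a scratch directory first (for the real thing do the same in your home directory):

```shell
# Redirect a log file into /dev/null via a symlink - demo in a temp dir
d=$(mktemp -d)
cd "$d"
ln -s /dev/null .xsession-errors
echo "warning spam" >> .xsession-errors   # the write disappears into /dev/null
readlink .xsession-errors                 # -> /dev/null
```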
6/3/12 - One minor but somewhat irritating bug with Gnome 3 is
how when launching Nautilus or more instances of Gedit the pointer
spins for several seconds although it's still functional. The fix
is to edit the desktop files of the affected apps in
/usr/share/applications - in particular nautilus.desktop,
nautilus-home.desktop and gedit.desktop - and change to
StartupNotify=false (instead of true).
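The change can also be scripted with sed. Here the substitution is demonstrated on a scratch copy of a minimal desktop file (for the real files, run the editor or sed as root on the files in /usr/share/applications - and keep backups):

```shell
# Demonstrate flipping StartupNotify on a scratch copy of a desktop file
tmp=$(mktemp)
printf '[Desktop Entry]\nName=gedit\nStartupNotify=true\n' > "$tmp"
sed -i 's/^StartupNotify=true$/StartupNotify=false/' "$tmp"
result=$(grep StartupNotify "$tmp")
echo "$result"              # -> StartupNotify=false
rm -f "$tmp"
```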
I got a better menu going... the Applications Menu extension is
functional but I don't like clicking the categories to expand, so
got the Axe Menu extension (from the author's site as
extensions.gnome.org is down at the moment). Had to make some
settings changes to make it fit the screen, but that was easy to
do: right-click the menu, tell it not to use a fixed size, reduce
the icon sizes, eliminate some of the left-pane categories, and
set the category box to scroll. One issue - it shows lots of "debian"
categories with invalid icons that I don't want to see, so I made
a little code change...
.....
while ((nextType = iter.next()) != GMenu.TreeItemType.INVALID) {
    if (nextType == GMenu.TreeItemType.DIRECTORY) {
        let dir = iter.get_directory();
        if (dir.get_is_nodisplay()) continue; //added by WTN 6/3/12
        this.applicationsByCategory[dir.get_menu_id()] = new Array();
.....
Got the idea from
here. This is one of the coolest things about the extension
system - it makes it easier than ever to change how stuff works.
Sure there's a risk of breaking something but the shell is pretty
good about saying no if something is wrong and if I totally bork
it I can just boot into another session and remove the code - it's
a whole lot safer than directly editing OS code. Now if I can find
better docs...
6/4/12 - VirtualBox running Ubuntu 12.04/Gnome Shell with Axe
Menu, Panel Docklet and other extensions...
The setup on my HP Mini 110 is very similar except no Places
section to save vertical space. The Restart Shell option is handy
when running under VirtualBox... fixes the corrupted "Activities"
screen and reinitializes extensions after resizing the VB window
or making code changes.
8/30/12 - a screenshot of my present HP Mini 110 Gnome Shell
setup...
The maximus extension is currently disabled - although it
provides more screen space when an app is maximized, it makes it
harder to close or restore maximized apps. The main reason I was
using it was for Firefox, and in that app I can simply press F11
to toggle full screen. The Reversi game is OTHOV, something I
made for a hacked version of hpbasic running in simh, simulating
an HP21xx-type minicomputer.
2/16/13 - my current 12.04 "lxgnome_effects" (compiz) session
running under VirtualBox...
The VM screen width was reduced for the screenshot. Normally the
Unity 2D panel on the side isn't running, toggled by the yellow
panel icon.. useful for some things like recent documents and
searching for videos. For that matter, usually I just run this in
a plain lxpanel metacity session (effects slow things down,
especially in a VM) but compiz seems to be working well. Also works
with mutter but under VB it causes an irritating background color
flash when selecting menu options. The theme is a modified version
of Ambiance with a dark Nautilus side panel and tweaks to make
tabs and scroll bars look better (to me anyway). LightDM now
supports timed autologin, yay... that was almost a dealbreaker
since I like to turn on my computer and have it just boot, and
GDM is still semi-broken in 12.04.
My main machine is still running Ubuntu 10.04 and although for
the most part it works very well and I'm in no hurry to upgrade,
desktop support is ending in a couple of months. Support for
non-GUI and I presume security updates will continue for 2 more
years; if that includes Firefox I'm tempted to stick with 10.04
for now - the real determining factor is going to be support for
10.04 by other providers (LibreOffice, VirtualBox, etc). But at
some point I'm going to have to upgrade and I do not look forward
to that - very disruptive, as in my stuff stops working until
I fix it. As far as I can tell most of what I need will be
fixable, just time-consuming redoing PPA's, installing new
versions etc. The main issue I can see is getting temperature
indicators to work with my hardware (an MSI KM4M-V motherboard with
a cheap NVIDIA graphics card).. I can do that so long as I stick with
Gnome Panel (I made a test USB install and verified that it can be
done with a bit of compiling and hacking), but the temp indicators
for LxPanel and Gnome Shell don't easily support adding an offset
- my sensor returns readings relative to "too hot" rather than an
actual temperature. The main reason I even need a temp sensor on
this machine is that the fan speed driver appears to be in the BIOS
itself and doesn't always increase fan speed, so I need to keep an
eye on the CPU temperature or it will shut down... tempted to
permanently fix that by adding a hardware fan speed control
circuit.. and it would probably be easier to add a thermometer I
can stick on the outside of the case rather than trying to hack in
software support - it gets old having to recompile or reinstall
the custom K10temp driver every time the kernel changes.
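The offset fix itself is trivial once there's a place to put it - a sketch of the kind of wrapper I have in mind, assuming the sensor's raw value is degrees below the trip point (the function name, offset value, and sample reading are made up for illustration, not from any real driver):

```shell
#!/bin/bash
# hypothetical sensor wrapper: the chip reports degrees relative to
# "too hot", so add a calibration offset to approximate a real temperature
OFFSET=90   # assumed: a raw reading of 0 means roughly 90 C

adjust_temp () {
    # $1 = raw reading (negative = degrees below the trip point)
    echo $(( $1 + OFFSET ))
}

echo "CPU temp: $(adjust_temp -25) C"   # prints "CPU temp: 65 C"
```

The hard part isn't the math, it's getting a hook like this into panel indicators that only read the raw sensor value directly.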
GUI-wise, a session running Nautilus and LxPanel seems to be the
most attractive for what I do - I like simplicity, the extra
screen space, and a normal app menu - but I'll have to trim down
the number of launchers in the panel to leave as much room as
possible for window buttons (when I'm working I typically have
many open windows and need to quickly select them with a click).
Gnome Shell with extensions or Gnome Panel are other possibilities
but then I'm back to a 2-panel setup. Gnome Panel has retract
buttons to reclaim space as needed but the new one isn't quite as
configurable as the old Gnome 2 panel (but it is less buggy other
than some fixable theming issues). Gnome Shell with certain
extensions looks better but is more bloated, and so far I haven't
found an extension that provides a normal app menu that doesn't
cover up half the screen with eye candy or require extra
clicking... when I'm working I don't care about visuals as much as
being able to get to my work apps, files, directories and open
windows as quickly as possible - I want my OS to do what I tell it
to then get out of my way, not be the main show. No OS is perfect
(especially stock - invariably they seem to be designed for
someone else), but there are many GUI choices for Linux-based
distributions.. and even more choice when one mixes up the pieces.
Surely one of the supported options will do fine once I get past
the upgrade and get used to it.
Ubuntu
12.04 on my main machine
[my main machine is at the moment a ZaReason Limbo 6000A tower
from March 2011 with a quad-core 3 GHz AMD Athlon II X4 640
processor, since upgraded with a cheap NVIDIA GT210 graphics card,
8 gigabytes of RAM and a 1T hard drive in addition to the 500G
drive it came with.]
5/11/13 - Did it... upgraded my main production system from 10.04
to 12.04 using the update manager app. Went surprisingly well...
after making a full image backup of my hard drive and making sure
I could mount it, clicked the big fat upgrade button and let it do
its thing. Wasn't exactly automatic and there were glitches - had
to select the display manager (lightdm), decide whether to keep
certain config files (kept grub, replaced the rest), a few buggy
dialogs with bunches of unreadable blocks instead of text
(hopefully not saying anything important, hit enter a few times to
clear), told it to keep "obsolete" software (some of that stuff I
need for manually installed apps), tons of error messages in the
console as it was upgrading but that's pretty much normal, upon
reboot got a scary message about the disk with /tmp not being
available, but I did nothing and it continued on its own and booted
into Unity. Hilarious stack of icons on the side; logged out and back into
LXDE (which was previously installed) to set about fixing stuff.
Used Synaptic to fix a couple of broken packages; to remove one
broken package I had to create an empty directory it was complaining
it couldn't change to. Installed gnome-panel for a more sane UI
and set about putting my system back together... not much got
clobbered, reinstalled wine, AcroRead, Google Earth (which
installed ia32-libs, need that for many other things), fixed up
gnome-panel with the stuff I like including adding my modified
Ambiance theme, had to manually download and compile sensors
applet to get CPU/Video temp back. Started the upgrade process
about 9am, about 12 it was done, by 3pm had a functional system
that for the most part is just like Gnome 2. After a bit more
tweaking... [updated screenshot 5/13/13]
I can live with that (as in Yay It Worked).
It's not perfect - a few (expected) cosmetic glitches related to
theming, especially with GTK2 apps... messing around with
gtk-chtheme helps - but it came out a whole lot more perfect than
I was expecting. Actually, doing an update manager
upgrade-in-place on such a "used" system I was expecting an
unbootable mess and was prepared to do lots more fixing and maybe
delete it all and install from scratch, but thought I'd try the
update manager method first just in case it worked so I could
avoid having to redo all my custom stuff - and I was not
disappointed! I'm sure there will be things to fix or adapt to,
but it looks like most of my scripts, associations, virtual
machines, wine apps, dosemu, manually installed apps and other
stuff I depend on made it through the upgrade process - it would
have taken a week or more to get to this point with a fresh
install. The downside is lots of leftover cruft that no longer
applies but that's OK.. easier to ignore the extra stuff than to
figure out what is truly no longer needed.
5/13/13 - Replaced the screenshot from 5/11, not much changed but
now using indicator-applet-complete instead of separate indicators
with a separate clock and having to run gnome-sound-applet in
startup apps to get a volume control (that conflicts with other
sessions), and configured a new session to use mutter instead of
no effects. I had issues using the Classic with effects (compiz)
session - the animations weren't exactly smooth and the desktop
pager broke after telling it to use 2 workspaces (clicking the 2nd
workspace resulted in an empty screen - no panel icons right-click
or anything). I'm not into heavy graphics but basic things like
shadows and simple animations are useful and I liked what I saw in
Gnome Shell (after adding lxpanel and tweaking it a bit), which
uses the mutter window manager. So installed the standalone mutter
package then copied the existing fallback session files and edited
them accordingly...
----- file /usr/share/xsessions/gnome-classic-mutter.desktop ---------------
[Desktop Entry]
Name=GNOME Classic (Mutter)
Comment=This session logs you into GNOME with the traditional panel using the mutter WM.
Exec=gnome-session --session=gnome-classic-mutter
TryExec=gnome-session
Icon=
Type=Application
X-Ubuntu-Gettext-Domain=gnome-session-3.0
----------------------------------------------------------------------------
----- file /usr/share/gnome-session/sessions/gnome-classic-mutter.session --
[GNOME Session]
Name=GNOME Classic (Mutter)
RequiredComponents=gnome-panel;gnome-settings-daemon;
RequiredProviders=windowmanager;
DefaultProvider-windowmanager=mutter
DefaultProvider-notifications=notify-osd
DesktopName=GNOME
----------------------------------------------------------------------------
Also mucked around using dconf-editor - mutter causes a very
minor cosmetic glitch in the left-side title bar menu button,
right-clicking the title bar brings up the same menu so set the
key /org/gnome/desktop/wm/preferences/button-layout to
":minimize,maximize,close" (no menu). Did other stuff with
dconf-editor to set up things like I want.. like adjusting the
clock format under /com/canonical/indicator/datetime and enabling
mutter's edge tiling under /org/gnome/mutter. I occasionally use
Unity 2D (the yellow panel icon turns it on and off) so set the
key /com/canonical/unity-2d/launcher/hide-mode to 2 to enable
dodge-windows. Used Gnome Tweak Tool to enable other stuff, mainly
affecting the Gnome Shell session but there's some overlap like
showing mounted drives on the desktop. Other tweaks include
uninstalling the overlay scrollbar packages (I want plain
scrollbars not popup widgets), added my CreateLauncher and
AddToApplications scripts (see 5/19/12 and 5/20/12 entries) to
nautilus-scripts to make up for removed features. All my existing
desktop shortcuts and file associations made it through the
upgrade process so haven't needed the scripts yet, but I'm sure at
some point I'll need to make a launcher or add a binary or script
to the list of apps that files can be associated to.
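For reference, the dconf tweaks above can also be made from a terminal instead of dconf-editor. A sketch of the equivalent commands - values as described above; the edge-tiling key name is my assumption for 12.04's mutter schema, so verify it in dconf-editor first:

```shell
# title bar buttons without the left-side menu button
dconf write /org/gnome/desktop/wm/preferences/button-layout "':minimize,maximize,close'"
# mutter edge tiling (key name assumed)
dconf write /org/gnome/mutter/edge-tiling true
# Unity 2D launcher dodge-windows mode
dconf write /com/canonical/unity-2d/launcher/hide-mode 2
```

This is handy for putting settings into a setup script rather than clicking through the tree every time.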
So far so good... it's better than Gnome 2. Upgrading in place is
usually not recommended but in my case it kept the vast majority
of my customizations intact, saving a huge amount of time; I know
how to fix glitches like conflicting/broken packages, and I made a
full backup image of my existing 10.04 disk before attempting the
upgrade so if it didn't work I could restore things to the way
they were or be able to pull stuff over after installing from
scratch. Despite dreading it for a year the upgrade seems to be a
phenomenal success, my system is snappier than it has ever been.
Other than a couple cosmetic bugs in gnome panel that I don't care
about (can't usefully make the panel transparent because widgets
use the background from the theme, and sometimes the text
temporarily overlays the icon on the first window button), a
somewhat different (better) widget set, a few differences that
make it work and look better, and more up-to-date apps (the main
reason for upgrading), it basically IS my old system with a shiny
new skin.
6/22/13 - After yesterday's update of mesa, the mutter window
manager no longer works properly, breaking my gnome-classic-mutter
session and also Gnome Shell, which now just loads a dysfunctional
desktop with no panels etc. Mutter works (and restores classic
functionality) if I run the command mutter --replace in a
terminal, but the same command run from a GUI tool or
mystartapps.sh does not work.. has to be run from an open
terminal. So... until whatever broke stuff gets unbroken, using
plain Gnome Classic with metacity. At least it's faster without
effects...
This might be a malfunction in my video card or maybe the NVIDIA
driver - mutter fails horribly in VirtualBox now (totally garbled
graphics) and that virtual system has not been updated. I have
been having video-related flakes lately - sometime back the system
started intermittently failing to bring up the GUI (after a kernel
update), a problem that was solved by installing the latest NVIDIA
driver (which wasn't exactly easy - I failed to uninstall the
previous version first which caused all sorts of issues). Now this
issue - it seems
correlated with the mesa update but with no easy way to roll back
updates it's hard to test the theory, could just be coincidence.
Regardless, things were humming along fine, then after an update
mutter no longer worked unless run from a terminal - it's hard to
imagine what sort of hardware failure would care whether or not my
window manager ran from startup scripts or a terminal, and the
issues happened after updates, so I'm leaning towards some sort of
software issue. The new software might simply not be fully
compatible with my cheap PNY/NVIDIA GeForce 210 card using the
NVIDIA driver (the nouveau driver doesn't work right on this
machine configuration). Might be time to get a better video
card...
6/23/13 - Problem sorted (for now).. basically just reinstalled
the NVIDIA driver - should have thought of that but never had to
do that before - apparently some updates stomp on the driver
configuration (still weird that mutter would run from a terminal
but not from GUI startup). To track down the problem I found the
error in .xsession-errors and googled.. sure enough, users were
reporting Gnome Shell not starting after some update and then
reinstalling the driver to fix it. The procedure is fairly simple
- ctrl-alt-F1 to get a text screen and log in, run "sudo service
lightdm stop" to kill X, change to the directory containing the
driver then sudo ./NVIDIA-Linux-[press tab to complete the
filename], follow the prompts to uninstall and reinstall
(including the dkms module) then sudo reboot (might help to cross
fingers). Part of the issue is probably because the driver is
closed-source so developers can't fully understand or fix all the
interactions... another part of the issue might be an
incompatibility with vesafb as noted in dmesg output but so far I
haven't found an easy way to disable it (lots of conflicting
low-level info and I'm kind of getting tired of breaking then
having to fix my system, plus without vesafb I'd likely uglify my
full-screen text terminals), yet another issue seems to be
poorly-tested updates that muck with stuff that should be left
alone. Perhaps the solution is still to get a better/more
compatible video card... but which one? [...]
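Written out as a sequence, the driver-reinstall procedure from this entry looks something like the following (the installer filename and download directory are placeholders - use whatever version was actually downloaded):

```shell
# from a text console (ctrl-alt-F1, log in), not from inside X:
sudo service lightdm stop          # stop the display manager / kill X
cd ~/Downloads                     # wherever the NVIDIA installer lives
sudo ./NVIDIA-Linux-*.run          # follow prompts: uninstall old, reinstall, dkms module
sudo reboot                        # might help to cross fingers
```

Keeping the installer file around pays off, since apparently some updates stomp on the driver and this dance has to be repeated.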
7/6/13 - Disabling the conflicting vesafb driver is fairly
easy... edit /etc/default/grub (back it up first) and change the
lines...
GRUB_CMDLINE_LINUX_DEFAULT="quiet splash"
GRUB_CMDLINE_LINUX=""
...to...
GRUB_CMDLINE_LINUX_DEFAULT=""
GRUB_CMDLINE_LINUX="video=vesa:off vga=normal"
...then run update-grub to update the boot stuff. However, this
blows away the pretty Ubuntu splash screen while booting leaving
just a lot of text flying by (which doesn't bother me.. basically
the same as dmesg output), and also makes virtual full-screen
consoles (ctrl-alt-F1 etc) use an 80x24 screen instead of nice
smaller fonts. This gets rid of the vesafb warning from the NVIDIA
driver (making me feel a bit better) but otherwise can't tell any
difference... there is still the bug where the first double-click
of a session doesn't work (regardless of the GUI environment; the
workaround is to do it again, or right after booting highlight the
file first - this bug does not happen on my HP Mini 110 or under
VirtualBox), and VirtualBox won't run anything that uses mutter
(Gnome Shell, Cinnamon, custom sessions etc). I don't care that
much about the double-click bug (only an issue right after
booting) but it would be nice to fix the VirtualBox issue as
that's how I experiment with different configurations before
trying them on real hardware. There might not be a solution other
than get another video card.. seems that NVIDIA is not fully
compatible with Ubuntu (and likely other distros), the mutter
window manager seems to be picky about what it will run on, and
VirtualBox has always had issues running 3D or muttery things on
this machine (just went from semi-broken to fully-broken). Modern
PC's are very complex beasts with a huge amount of variation -
it's almost a miracle that modern software works at all -
especially free software - so not sweating a few bugs (just
documenting them), it's just a matter of figuring out what works
and what doesn't, fixing stuff if possible and learning to live
without fancy stuff that doesn't work and in the grand scheme of
things I don't really need anyway.
[update... got mutter and Gnome Shell working in VirtualBox
again.. see below]
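The /etc/default/grub edit from the 7/6 entry can also be applied with sed instead of a text editor. A sketch that assumes the two lines look exactly as shown in that entry - demoed here on a scratch copy; for real use back up /etc/default/grub, run the function on it, then run update-grub:

```shell
#!/bin/bash
# sketch: apply the vesafb-disabling grub edit non-interactively;
# assumes the two GRUB_CMDLINE lines match the stock values exactly
apply_grub_edit () {
    sed -i \
      -e 's/^GRUB_CMDLINE_LINUX_DEFAULT="quiet splash"$/GRUB_CMDLINE_LINUX_DEFAULT=""/' \
      -e 's/^GRUB_CMDLINE_LINUX=""$/GRUB_CMDLINE_LINUX="video=vesa:off vga=normal"/' \
      "$1"
}

# demo on a scratch copy rather than the real file
tmp=$(mktemp)
printf '%s\n' 'GRUB_CMDLINE_LINUX_DEFAULT="quiet splash"' 'GRUB_CMDLINE_LINUX=""' > "$tmp"
apply_grub_edit "$tmp"
cat "$tmp"
```

If the lines have been customized the patterns won't match and nothing changes, which is safer than blindly overwriting.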
On the other hand, VirtualBox works quite well for Windows 7 when
2D/3D acceleration is turned off and that's what really counts
since that's how I run Altium Designer to make circuit boards - it
must work or I'd be stuck with having to set up a dedicated
machine for it and I really don't want to do that. Especially
considering the last time I had a dedicated Windows box for it I
had bugs bugs bugs and it eventually crashed and burned taking out
the whole damn OS, not to mention the hassle of having to transfer
files between the Windows box and my main work machine. Running W7
virtually solved pretty much all the issues - backup is just a
matter of copying a directory, Altium Designer works MUCH better
when all it sees is plain video (sure enough if I enable VB's
2D/3D acceleration Altium crashes the OS as soon as it runs), and
VirtualBox's shared directory feature works fine for transferring
files back and forth. To run Windows 7 in a plain window without
VirtualBox's menus to maximize screen area I use a shortcut
launcher that runs the command...
VBoxSDL --termacpi --startvm "Win7pro"
...if for some reason I need to run it with VB's menus I can just
start it from the main VirtualBox program. I've been running W7
from a 40 gig virtual disk but space was getting tight and it was
time to attempt an upgrade to the new version of Altium (which due
to upgrade risks needs to be installed in parallel with the old
version), plus I might need to install other huge stuff like maybe
SolidWorks (big maybe, will have to see how it reacts to plain
video), so needed to increase the disk size. The existing virtual
disk was dynamic, so I could use the command...
VBoxManage modifyhd Win7pro.vdi --resize 80000
...(in the virtual machine directory) to resize the disk to about
80 gigs. Reportedly this doesn't work if the virtual disk was set
up to be a fixed size. This command only adds unallocated disk
space, to make use of the extra space I attached an Ubuntu live CD
image and booted it then used gparted to expand the Windows
partition into the extra space, removed the CD image from the
virtual CD drive and rebooted. Windows went through a CHKDSK thing
to verify the file system but otherwise didn't complain and now
I've got lots of extra disk space. Installing AD 13 went well,
seems to be an improvement over AD 10 but without changing much
(yes I hate change especially when I have work to do and have
already invested huge amounts of time learning how to make the
stuff work... the only changes I want to see are bug fixes and
improvements).
Other software notes...
Recently Google Earth started failing with "invalid http request"
when I entered an address, so got the new version 7 deb from the
Google Earth page and installed it. Didn't work... uninstalled it,
removed the .googleearth directory from my home dir, reinstalled
the deb, works now. Much improved, the previous version had really
ugly monospaced fonts but now it looks fine.
Sometimes I need to save video and audio playing on my computer
and for some things browser-based tools won't work. Previously I
tried to use a program called RecordMyDesktop but it never worked
right (no audio and now it doesn't work at all), but found
something called Kazam Screencaster in the repositories, works
great once I set it to use H264/MP4 encoding for video (VP8/WebM
didn't work) and "monitor of built-in stereo audio" (otherwise no
sound). The GUI makes it easy to select the area of the screen to
record, has an adjustable delay before starting and adds a "tray"
icon to gnome panel to stop the recording and prompt for a
filename to save the video. Haven't tested it much but seems to
work, kept up with 30 frames a second while encoding a 360x200
view window. I'm sure at some size point the frame rate will have
to be reduced, no option to dump raw uncompressed video.
7/8/13 - [edited] Was playing around and noticed that mutter was
still working on my other older/stuffed-up 12.04 VirtualBox VM
(but Gnome Shell just went to fallback)... I hadn't updated that
system in 2 months so backed it up then applied all updates
(expecting failure), but mutter still worked (but not Shell).
After reinstalling the VB guest additions (have to after a kernel
update to get a usable system), found that Gnome Shell worked too.
Cool. Updated my newer/less-stuffed-up 12.04 VM... didn't make any
difference, mutter still hopelessly garbled. Updated the VB guest
additions (from 4.2.12 to 4.2.14), now mutter works, so does Gnome
Shell (mostly.. on my newer 12.04 VM with splash/lightdm sometimes
it fails to launch, and in both VM's the overview still has a
garbled background but it has always done that and I don't use or
care about that feature). Basically the fix was to update the
guest additions.
9/7/13 - Ever since upgrading my main system from 10.04 to 12.04
it's had a minor bug when first starting the system - the first
double-click of a desktop icon failed unless I highlighted an icon
then clicked the background to unhighlight first (or just
double-clicked twice), after that it was fine but I noticed that
sometimes after running VirtualBox the bug would return. Didn't
matter what GUI or window manager I used. None of my other systems
did this (real or virtual) and couldn't find any reports of anyone
else having that problem, figured it was probably some X thing
related to my cheap video card, or maybe something to do with USB
or my cheap generic mouse and just expected and ignored it. When
it comes to complex operating systems, if that's the main bug
it's probably in pretty good shape! But still, whenever the kernel
was updated I'd check to see if the double-click bug was fixed
(instead of doing the click dance after booting to prime the GUI),
and after today's update to 3.2.0-53 the bug seems to be gone. A
little thing but one step closer to perfection, a few seconds
saved each day. There is still an occasional apport system error
dialog that pops up because some service I usually don't need had
a problem but these tend to take care of themselves (and provide
the devs with debugging info), usually involves some leftover
cruft or service I don't need anyway, and it doesn't happen often
enough to apply the real fix (remove apport:-).
So now, as far as the GUI itself is concerned.. I can't think of
anything important I want to fix. Probably a rare moment that will
pass! Sure there are plenty of app bugs (that will probably always
be the case), but the combination of the Ubuntu 12.04 core,
Nautilus 3.4 and other Gnome 3.4 components, Gnome Panel, the
Mutter WM, a self-compiled indicator-sensors package (to get
semi-accurate CPU/NVIDIA temperature readings), and a few hacks to
the Ambiance theme, gives me a GUI that works pretty much exactly
like I want (works like Gnome 2) and doesn't slow me down with
unnecessary graphics effects.
10/21/13 - My initial double-click bug is back.. but whatever, I'm
used to "priming" things by clicking around when I start up. I'm
thinking it's a result of some odd interaction between the NVIDIA
driver, the X system and/or Nautilus based on an observation after
an "incident". A couple of days ago applied a bunch of updates
including Xorg and afterwards nothing using OpenGL (mutter,
compiz, glxgears, etc) would work, had to boot into a plain
metacity gnome panel session (yay for multiple sessions!). No
double-click bug - a clue that it's something to do with the
NVIDIA driver. Problem was the update cleared my xorg.conf file
causing the NVIDIA driver to not be properly loaded. Not sure if
this is a bug as I'm currently using the NVIDIA driver from their
website, not the repository version which may (or may not) account
for this lack of consideration. Not knowing the cause yet I
proceeded to extend foot and shoot by installing the repo driver -
don't do that - the result is no GUI and having to reinstall the
driver from a text console.. all I would have had to do was restore the
xorg.conf from a backup. To properly convert to the repo driver I
(theoretically) should run the installer from the website with the
--uninstall option then apt-get install nvidia-319 to get the
latest 319-series version. But mucking around with video drivers is
nerve-racking so putting it off, it's working now.
The PPA version of LibreOffice 4.1 has broken EPS import. To make
docs for my circuits I need to extract the schematic from a PDF
file output by Altium Designer and get it into a rotated EPS file
to import into LibreOffice, using the following commands...
pdftops -noembtt -f 1 -l 1 "[name].pdf" "[name]_schem.ps"
ps2eps -R + < "[name]_schem.ps" > "[name]_schem_rotated.eps"
Some time ago, about when 4.1 came out, EPS import failed
with a "graphics filter not found" error; to get by I installed
OpenOffice from the website and associated it with DOC and ODT
files, but I like LibreOffice better. Bug report says fix
released, but not seeing it in the PPA.
Not sure what the holdup is, got work to do so removed the PPA
version and installed the website version of LibreOffice, problem
solved and it's off the update system. Newer versions of software
often do fix problems and work better, but for important apps I
need to be able to roll back updates if something breaks. An
example is SVG files: previously I had tried to import SVG
graphics of circuit boards produced by the gerbview program in an
attempt to make sharp zoomable docs, and it didn't work right
(missing pads etc). Now it works perfectly.
From the I Want To Keep My Obsolete Software That Works Just Fine department...
I still have a few dos apps and batch files that I use (mainly
QBasic stuff) which I run using DosEmu/FreeDOS. Despite being over
two decades old, for one-off stuff it's still easier to fire up
QBasic and just do it rather than editing, compiling, repeat until
it compiles, repeat until it produces the output I'm looking for -
with QBasic all that is in one app, having used it for a long time
I know it well, and when I'm working speed to answer counts, looks
don't even figure (nobody else will see it). Previously if I had
to process data I'd have to copy it to my DosEmu file tree, launch
DosEmu and do whatever, copy the data back to wherever I was
working then clean out the temp stuff. If I drop a symlink into
the DosEmu tree then dos apps can access that target dir, so I
wrote this "dosprompt" script...
#!/bin/bash
# launch a dosemu dos prompt in the current directory
dosroot=~/MYDOS   # location of dos filesystem dosemu starts in
td=dp.tmp         # temp dir in dos file system
pn=`pwd`
mkdir $dosroot/$td
ln -s "$pn" $dosroot/$td/cdir   # link to current dir
echo >$dosroot/$td/run.bat "@echo off"
echo >>$dosroot/$td/run.bat "cls"
echo >>$dosroot/$td/run.bat "cd \\$td\\cdir"
xdosemu -E "\\$td\\run.bat"
rm $dosroot/$td/run.bat
rm $dosroot/$td/cdir
rmdir $dosroot/$td
The script lives in my Nautilus Scripts directory so wherever I'm
at I can right-click, Scripts, dosprompt and it boots DosEmu with
that directory current (named dp.tmp\cdir but that doesn't
matter). DosEmu doesn't provide a way to launch in a particular
directory, so instead the script creates a batch file that changes
to the symlinked directory and runs the batch with DosEmu using
the -E option to tell it to stay running after the batch exits.
Most of the time I'm running QBasic programs so if the program
already exists it's nice to just run it (saves having to type
qbasic /run program.bas), so using similar techniques I made this
"qbasic_pause" script...
#!/bin/bash
# runs a BASIC file in QBASIC using dosemu
# current directory mapped so BASIC program will have
# access to all files in the current dir and subdirs
dosroot=~/MYDOS   # location of dos filesystem dosemu starts in
td=runbas.tmp     # temp dir in dos file system
if [ -e "$1" ]; then
  bn=`basename "$1"`
  pn=`dirname "$1"`
  if [ "$pn" == "." ]; then pn=`pwd`; fi   # in case run from terminal
  if [ -n "$bn" ]; then
    mkdir $dosroot/$td
    ln -s "$pn" $dosroot/$td/dir.tmp   # link to current dir
    echo >$dosroot/$td/runbas.bat "@echo off"
    echo >>$dosroot/$td/runbas.bat "cls"
    echo >>$dosroot/$td/runbas.bat "cd \\$td\\dir.tmp"
    echo >>$dosroot/$td/runbas.bat "qbasic /run $bn"
    echo >>$dosroot/$td/runbas.bat "echo."
    echo >>$dosroot/$td/runbas.bat "pause"
    xdosemu "$dosroot/$td/runbas.bat"
    rm $dosroot/$td/runbas.bat
    rm $dosroot/$td/dir.tmp
    rmdir $dosroot/$td
  fi
fi
This one is designed to have BAS files associated to it but can
also be used as a Nautilus script. To perform the association I
used Assogiate (aka "File Types Editor") to create a mime type for
*.bas files with text/plain as the parent type, then used my
AddToApplications script to create a desktop file for the
qbasic_pause script so Gnome can be told to add it to
associations for *.bas files (I miss being able to associate
directly to a script but I guess that made too much sense..). Now
I can right-click a QBasic file and open with qbasic_pause to run
it with the current directory set to the location of the QBasic
program, so no problem finding data files.
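For the curious, the desktop file that AddToApplications generates for this would look roughly like the sketch below - the path, Name, and MimeType values here are illustrative guesses, not a copy of the real file:

```ini
[Desktop Entry]
Type=Application
Name=qbasic_pause
# path is an example - point Exec at wherever the script actually lives
Exec=/home/user/bin/qbasic_pause %f
Terminal=false
NoDisplay=true
MimeType=text/x-qbasic;
```

The %f field code is what passes the clicked file's path to the script as $1.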
True I probably spent more time figuring out these scripts (then
writing about them) than I save from using them.. but that's fine
since time spent when not working is "free" and when I do need to
use dos in the course of working I can just do it without being
distracted by extra file copies etc, for me that makes it well
worth the effort. If I could find modern software that works
better for quickly writing code (I didn't say good-looking code,
just code) I'd use it, but so far for simple input, process,
output stuff nothing I've found beats the old QBasic (QB64 doesn't
let me run it or a program from anywhere so it's out), and if I
need to distribute the app I can use FreeBasic to compile it for
Linux or Windows.
What else is going on...
I recently installed a mostly stock Ubuntu 12.04.3 (Unity and
all) to a friend's HP110 netbook - same model that I have but mine
usually runs a custom session I made using Nautilus and lxpanel.
We'll see how she adapts to Unity.. for normal-user
one-app-at-a-time use it actually works quite well, the global
menu saves vertical screen space. My main beefs with the global
menu are it's invisible unless the mouse is over it, and when an
app is in a smaller window then the menu should follow the window,
not stay at the top (when visible). In my opinion anyway. Added a
few essentials like flash and codecs, and for me Synaptic, removed
extra launcher icons leaving just the file manager and Firefox.
Had a minor issue when installing that required googling...
installed dual boot to keep the original (Ubuntu 8.04-based) HP
Mobile system, when it got to the partition size screen it didn't
label which side was which.. the new system is on the right, so to
make it bigger you have to slide the slider to the left. Otherwise
installation was uneventful and other than a black screen and
stray message when booting (didn't do the framebuffer fix) the new
system seems to work fine and the old system still works for a
backup. Sort of.
Got NetFlix under Ubuntu.. followed the directions
on the webupd8 site.. basically it's a plugin that uses a custom
version of wine to run Silverlight as if it were a native plugin.
Bloated but clever and so far works.. under Firefox once after
pausing I had to refresh the page to get it to continue but
otherwise no issues once I figured out the user agent string...the
User Agent Switcher plugin didn't work under either Chromium or
Firefox, the User Agent Overrider plugin works. Netflix also works
in Windows XP under VirtualBox in case the complicated wine-based
system goes down. But native is better. I really want something
like Apple TV and (when I want to) bypass my computer altogether..
kind of inconvenient to do anything else on a computer when it's
playing a full-screen video.. but still it's nice to be able to
play NetFlix videos. Got Hulu too, it's flash-based and has always
worked for me under Linux. Something will eventually have to be
done about that as flash is no longer supported under Linux -
unless using the Google Chrome browser (but pipelight now supports
running the Windows version of flash so that might be a solution
if the current Linux version of flash becomes unworkable). HTML 5
is proposing a controversial solution - basically a plugin system
for DRM videos but I'm guessing it'll only support closed systems
like Windows. What we really need is an open-source
platform-independent solution that lets content makers do their
DRM stuff but in any operating system. Yea that's a tall order..
but perhaps an interface layer with a "generic" x86 binary blob
within.. blob supplied by the service. Like almost every other
(especially Linux) user, I don't like DRM, but I do like good
content and if those providing the content demand DRM.. well here
we are. Thankfully there are clever hackers that can figure out
how to make stock Windows code run under Linux.
11/17/13 - Fooling around with Pear OS 8.. here's what my
VirtualBox install currently looks like...
The base system is Ubuntu 13.04 [corrected], running a customized
gnome-panel "fallback" session with a custom dock called "plank".
I added the "slanted e" icon next to the pear on the top bar, it's
a standard gnome-panel drop-down app menu because for me the stock
"launchpad" app launcher is full screen (sometimes, other times
under VB it's offset and doesn't show everything) and has no
categories.. not for me. To add the normal menu ran dconf-editor
went to org|gnome|gnome-panel|lockdown and unchecked locked-down,
then can alt-right-click and configure the panel. The global menu
can also be disabled/re-enabled this way. Many installed apps
aren't visible, had to run alacarte and unhide them to show in
launchpad or the gnome panel menu. The system is a bit flakey
running under VirtualBox.. sometimes the background goes away and
the window manager crashes. The plank dock seems a bit fragile,
when messing with the autohide settings it permanently hid
itself.. had to install gconf-editor and poke around to find the
setting to make it show again. It's a bit too easy to lose dock
icons.. for apps in a window easy to get back but if say launchpad
is accidentally undocked then [...back up the launchers in
~/.config/plank/dock1/launchers, or go to /usr/share/applications
and drag the launchpad icon to the dock]. The dock looks cool but
if I were using this on a real system I'd uncheck it in startup
apps and replace it with lxpanel (that works, just add to startup
apps - makes it easy to select which panel to use) or add another
gnome-panel for a bottom task bar. Actually, if I did use this
setup on a real machine I'd wait for the next LTS, and then
probably start with a stock Ubuntu install and add in the bits
that I want - basically the theme. Still it's nice to see what
people can do with the Gnome 3 infrastructure.
Mostly though (besides just being curious), I'm using the system to see
what's changed in the new[er] Ubuntu. Perhaps in a slightly bent
format but that's OK as I bend my system up pretty good too. So
far other than some flakiness with the gala WM (based on mutter)
when running under VirtualBox (mutter is also flakey under VB),
and some usage issues with plank and launchpad - both optional
apps - the core system seems fairly stable. But there are a few
core issues - biggest one I've noticed so far videos play back in
black and white (not single-color, blended black and white) - [...
aha - it's an incompatibility with the new mutter.. do metacity
--replace and videos now in color.. in the new daily ubuntu doing
mutter --replace causes the same black and white bug. It's a
VirtualBox thing.]
Another issue that has an easy fix - I found that I could not
make custom associations by adding .desktop files to
~/.local/share/applications, turns out that the desktop file must
contain NoDisplay=false for it to show up in the association list
- fixed my AddToApplications script in the 5/20/12 entry. I can
see why - it makes the list more trim by eliminating probably
useless choices - but still a change from 12.04 that took some
figuring out. File associations are extremely important to me so
glad it was just a simple change.
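For reference, here's the shape of a minimal hand-made association entry in ~/.local/share/applications with the NoDisplay line included (the name and Exec path below are just placeholders)...

```
[Desktop Entry]
Type=Application
Name=MyViewer
NoDisplay=false
Exec=/home/me/bin/myviewer %f
Terminal=false
```

Without NoDisplay=false the entry simply won't appear in the "open with" association list.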
In other happenings, updated my AmbianceMod
theme so that scroll bars in gtk2 apps will show up better.
11/18/13 - Here's a CreateShortcut script for nautilus-scripts
(derived from the AddToApplications script)...
#!/bin/bash
# CreateShortcut - 11/18/2013 WTN
# Makes desktop files for running scripts or binaries
# Put in ~/.gnome2/nautilus-scripts or in a path directory
# to use from a command line. Usage as a nautilus-script..
# Right-click executable, select Scripts CreateShortcut
# Enter name for the shortcut, OK or cancel to quit
# Select Yes or No when prompted to run app in a terminal
# Launcher will be created without an icon, to select an
# icon right-click, properties and browse for an icon.
#
prefix="launcher_"  # new desktop files start with this
if [ "$1" != "" ];then  # if a command line parm specified..
  execparm=$(which "$1")  # see if it's a file in a path dir
  if [ "$execparm" == "" ];then  # if not
    execparm=$(readlink -f "$1")  # make sure the full path is specified
  fi
  if [ ! -x "$execparm" ];then  # make sure it's an executable
    zenity --title "Create Shortcut" --error --text \
     "The specified file is not executable."
    exit
  fi
  if echo "$execparm" | grep -q " ";then  # filename has spaces
    execparm=\""$execparm"\"  # so add quotes
  fi
else  # no parm specified, prompt for the Exec command
  # no error checking, whatever is entered is added to the Exec line
  execparm=$(zenity --title "Create Shortcut" --entry --text \
   "Enter the command line for the launcher")
fi
if [ "$execparm" == "" ];then exit;fi
nameparm=$(zenity --title "Create Shortcut" --entry --text \
 "Enter a name for this launcher")
if [ "$nameparm" == "" ];then exit;fi
if zenity --title "Create Shortcut" --question --text \
 "Run the app in a terminal?";then
  termparm="true"
else
  termparm="false"
fi
# now create the desktop file
filename="$prefix$nameparm.desktop"
echo > "$filename"
echo >> "$filename" "[Desktop Entry]"
echo >> "$filename" "Type=Application"
echo >> "$filename" "NoDisplay=false"
echo >> "$filename" "Name=$nameparm"
echo >> "$filename" "Exec=$execparm %f"
echo >> "$filename" "Terminal=$termparm"
chmod +x "$filename"  # make executable
Another way to make launchers is this script (what I used to
use)...
#!/bin/bash
gnome-desktop-item-edit --create-new $(pwd) &
...however that requires gnome-panel and makes me browse for the
executable. I thought hmm.. would be easier to just enter a name
and choose terminal or no terminal, almost exactly what the
AddToApplications script does except put the desktop file in the
current directory and make it executable, I'll drag it to where I
want it. A few minutes of editing, done. It doesn't set the icon
(on my system the default is the same as for binaries), easy
enough to right-click properties and pick an icon if I want
something different.
11/19/13 - A few things. I've been tempted a time or two to set
up a normal blog where each "thought" gets its own file - but for
now probably not. Several reasons come to mind.. My notes files
are references on how I set up my systems (primarily for me) and
it works better when it's one big page so I can scan through it
and find my favorite scripts etc. Regular blogs tend to get
regurgitated ad nauseam through the blogosphere and while I don't
imagine my humble words being important enough to garner such
treatment, I also don't want to encourage it either, especially
when I might have something "political" to say about Ubuntu etc -
when I write such things it's to document things and what I think
about them in the context of my other notes, not to be picked out
in isolation (that said, I don't care if anyone copy/pastes parts
so long as attributed with a link back to the source page.. that
at least preserves the overall context, not to mention I might
change my mind or append to entries). Finally, it's a hellofalot
easier for me to just right-click open with seamonkey than to mess
around with blogging software - not that it won't happen in the
future but right now I'm fine with the all-in-one-file format.
TL;DR - then don't read it, I'm not writing for the fast crowd.
Lately Ubuntu has been taking quite a bit of flak.. occasionally
deserved but mostly simply because Canonical is a business and
they have to do business stuff. Like enforce trademarks and
generate revenue and insulate themselves from upstream changes.
Recently there was a somewhat overreaching C&D regarding
www.fixubuntu.com that in my opinion was resolved nicely - site
was fixed, apology issued, should be end of story. The shame is it
involved a media storm; I would like to think that in the future
such disputes can be handled more intelligently.
Generating revenue - well that's what companies have to do, if
you don't like the way Ubuntu does it then read my notes here or
go to the afore-mentioned site for a fix. The Ubuntu repositories
carry a number of alternative desktop environments and Mark and Co
have always supported the freedom to install and uninstall
whatever software the user wants or doesn't want. Personally I
don't use Unity (except occasionally the QT version on demand) and
I really don't care if some searches are also sent to the internet
(I find that useful for finding videos) even if to generate Amazon
ads. A setting was provided to turn that off, plenty of other ways
to disable that behavior, and while some of the criticism was
warranted (resulting in a global on/off switch and other ways to
control what gets sent outside the machine) we type stuff into
Google or a number of other web sites and have stupid ads follow
us around all the time without worrying about it. Ubuntu's version
is mostly anonymized, the rest of the internet most certainly
isn't. I'm more concerned about sites like Netflix and Hulu
deciding I like certain things, for everyone who sees my TV to see,
because I or someone else clicked something.. but even that's like
whatever, I still love my little Roku box (much better than
streaming to my PC) and occasionally those web sites get it right
and offer up something that I really do like. If Amazon ad
placements generate Canonical a few bucks then that's fine by me
- it helps the company keep the repositories up and updated which
helps me because if Canonical/Ubuntu (and the infrastructure they
provide) went away that would just suck. Keep in mind that a large
number of derivative distributions (including Mint) also tap into
Ubuntu's repositories.. despite naming differences, Mint and Pear
OS and many other derivatives ARE Ubuntu, whether officially
sanctioned or not. I also think (and this might be a controversial
thought) that derivatives with large user bases and operating
income that tap into Ubuntu's infrastructure should help with the
expense of maintaining that infrastructure by at least paying for
the additional bandwidth imposed by their users.. if some popular
website started in-lining my images and increased my internet bill
I'd want compensation or some other resolution; derivative distros
should be no different. We should all share in the expense for the
greatness that has been freely provided to us, in whatever way we
can, lest it go away.
Then there is insulating one's self from upstream changes. Mir
and Upstart caused a lot of controversy, the argument being why
didn't Ubuntu go with Wayland and systemd instead of creating its
own solutions. Well one reason is at the time the decisions
were made, the latter projects were not complete enough. But
another reason is so that Canonical can be free to do what it
needs to do.. sometimes (actually often) the goals of upstream
projects don't align with what a user wants - a distro is a user
of upstream. When that happens, or if it's something very
important, then the user must either fork or develop their own
solution that isn't affected by upstream. Gnome went nuts with
their shell, thus Unity. As far as I can tell Upstart was put in
place before systemd was ready. Development of Mir started before
Wayland was ready. Yet it's kind of funny that poking around in
Pear OS 8, based on Ubuntu 13.10 [sort of], that I find bits of
Wayland and systemd already installed - how odd given all the
hype. Is that true of the official image? [downloading daily...
installing... why yes it is. So like wow.] Everyone just needs to
chill, stick to actual issues instead of personal preferences, try
to be kind to one another despite differences, and stop being
surprised that a company acts like a... company.
11/21/13 - Hacking around with Ubuntu "trusty" testing in
VirtualBox (4.3.2)... [try 2]
This is a customized gnome-panel session with metacity,
compositing enabled. Originally I configured
it with the mutter window manager but that causes bugs with
video playback when running under VirtualBox.. metacity with
compositing looks/works almost as good other than the old metacity
bug where desktop icons leave trails when moved.. and having to
hit window borders exactly to resize - mutter gives me a bigger
target. Had to upgrade to VirtualBox 4.3.2 to make the shared
folder option work - as usual installed the guest additions and
gnome-system-tools to get the Users and Groups app and added
myself to the vboxsf group. Besides installing gnome-panel to get
the "Flashback (no effects)" session, also installed synaptic,
dconf-editor, gconf-editor, gnome-tweak-tool, gksu and alacarte
(plus other stuff but those are the main things). To make it more
compact for this virtual testing system, set up gnome panel to use
just a single bottom panel with the compact menu, a few launchers,
run dialog, close all windows, window list, notification area and
a clock. To get a volume control in this minimal setup added a
script to the startup apps that selectively runs
gnome-sound-applet if the flashback session is running (right now
$DESKTOP_SESSION returns "gnome-fallback" but that might change).
Used gnome-tweak-tool to enable icons in buttons and menus, used
dconf-editor to enable compositing in metacity, revert to normal
scroll bars, among other things. A few things still use gconf but
the dconf settings take precedence.
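The selective-startup script mentioned above could look something like this (a sketch; the function name and the exact session strings are assumptions - check what $DESKTOP_SESSION actually reports on your system)...

```shell
#!/bin/bash
# autostart helper (sketch): run extra components only in certain sessions

session_wants_sound_applet() {
    # session names are assumptions; "gnome-fallback" is what
    # $DESKTOP_SESSION reported at the time of writing
    case "$1" in
        gnome-fallback|gnome-flashback*) return 0 ;;
        *) return 1 ;;
    esac
}

if session_wants_sound_applet "$DESKTOP_SESSION"; then
    # start the volume control applet if it's installed
    command -v gnome-sound-applet >/dev/null && gnome-sound-applet &
fi
```

Add the script itself to the startup apps; since it checks the session name it's safe to leave in place when logging into Unity or other sessions.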
For the most part the system is "normal" but there are some
oddities/changes... The right-click/create new file option doesn't
work in the Flashback session unless something is put in
~/Templates, anything there makes the create empty file option
show (this isn't an issue in Unity). The nautilus scripts folder
has moved, now ~/.local/share/nautilus/scripts. Script options
don't show unless a file or directory is selected, that's dumb IMO
since it makes it harder to launch scripts that just need to run
in the current directory (Terminal, BrowseAsRoot, MakeShortcut
with no parm so it'll prompt, etc). The apport crash reporter is a
bit hypersensitive.. sometimes it notices everything I broke or
stopped then showers me with reports the next time I log in.. but last time I
tested a testing system it was like that, got dialed back before
release. This is all totally pre-alpha so lots of stuff will
likely break and change and get fixed before it finally becomes
Ubuntu 14.04... but so far I like it [except for Nautilus not
showing scripts when no file is selected.. that's an inconvenient
usage regression - also gnome panel doesn't space launcher icons
as closely as the version in 12.04 - and a stupid
bug where the loading of nautilus and indicators are blocked
for about a minute, workaround is to add the script
/usr/local/bin/gnome-panel containing the command "gnome-panel
&" to permit whatever is improperly calling gnome-panel to
continue. This rather obvious bug was reported 12/2/13 and still
hasn't been fixed by 1/17/14, not exactly encouraging but at least
there's a simple fix...].
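The wrapper-script workaround above can be staged like this (a sketch - note the full path inside the wrapper, since a script named gnome-panel that calls a plain "gnome-panel" could end up invoking itself)...

```shell
#!/bin/bash
# build the gnome-panel wrapper in a temp spot, then install it with sudo
mkdir -p /tmp/panel-fix
cat > /tmp/panel-fix/gnome-panel <<'EOF'
#!/bin/sh
# background the real panel so whatever calls this isn't blocked;
# full path avoids the wrapper re-invoking itself
/usr/bin/gnome-panel &
EOF
chmod +x /tmp/panel-fix/gnome-panel
# then: sudo install -m 755 /tmp/panel-fix/gnome-panel /usr/local/bin/gnome-panel
```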
1/10/14 - Happy New Year... time for another borked x11/mesa
update to kill mutter :-) On the forums one recommendation was to
install the "raring" backported kernel/graphics stack but that
wanted to remove stuff I really need.. like freecad, wine,
multiarch support, etc.. no thanks. I've suspected for awhile that
my manually installed NVIDIA 319 driver was causing issues (see
10/21/13 entry.. the last blowup) so logged out, ran sudo service
lightdm stop to kill X, ran the downloaded NVIDIA driver package
with the --uninstall option, sudo apt-get install nvidia-current
nvidia-settings (which installs version 304), rebooted.. problem
solved. Hopefully for good. Bonuses.. the initial double-click bug
seems to be gone [but now it's back.. ugh], and google-earth works
again (for now.. lately it had been crashing on startup but that
program has always been kind of flaky).
Last month I had to do some audio processing work... My
fiddle-playing friend Holly's Alesis HD24 recorder went down -
first thing I did was use hd24connect from hd24tools to make an
image copy of the drive and verify that the tracks were OK - there
is a Linux version but it has issues, won't play the tracks and
every time a new tab is selected have to move the window to
refresh it.. for previewing tracks had to use the Windows
version running under VirtualBox with a symlink to the image added
to the shared files directory. But the Linux version works for
imaging and extracting tracks to wav files. The image file is just
a straight byte-for-byte copy, have to run hd24connect as root to
access the drive placed in a USB drive tray.. afterwards reset the
image file permissions so I wouldn't have to run hd24connect as
root for extracting. Used dd to copy the drive image to a new
drive (same type and capacity.. an 80 gig WD800).. problem with
the machine ended up being bad contacts, loosened and cleaned the
bay connectors (common issue) and added some capacitors (common
update), machine works fine now, finished the sessions.
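For reference, the imaging steps look roughly like this (a sketch - /dev/sdX and the filenames are placeholders; double-check device names before pointing dd at real hardware). The runnable part just demonstrates dd's byte-for-byte behavior on ordinary files...

```shell
#!/bin/bash
# real-drive commands (as root), shown as comments since they need hardware:
#   dd if=/dev/sdX of=hd24.img bs=1M conv=noerror,sync   # drive -> image
#   dd if=hd24.img of=/dev/sdY bs=1M                     # image -> new drive
#   chown myuser:myuser hd24.img   # so extraction doesn't need root

# dd treats plain files the same way, so a safe demonstration:
printf 'track data' > /tmp/hd24-demo-src.img
dd if=/tmp/hd24-demo-src.img of=/tmp/hd24-demo-copy.img bs=1M 2>/dev/null
cmp -s /tmp/hd24-demo-src.img /tmp/hd24-demo-copy.img && echo "copies match"
```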
I figured at some point I'd be called upon for mixing, so found a
program called Ardour. Version
2.8.12 was in the Ubuntu Precise repository so installed that and
played around with it enough to figure out how to make stuff work.
Sure enough I was tapped for mixing one of the songs with a lot of
tedious automation like piecing multiple solo takes together to
make one - the software did well, it was able to do
everything requested by the artist. Here's the program in
operation (click for full size)...
For me as a musician and working with other musicians, this is a
game-changer! Until last month, we were mostly at the mercy of
professionals who besides charging for their services (one way or
another), often had their own ideas about mixing that made it more
difficult to get the product we want. Call it producer's syndrome..
often professional mixers are very talented and mean well, but
fail to realize that we're the musicians, already know what we
want, and have to live with the results. I don't need a producer,
I need someone (or some thing) that will do what we want without
injecting what they want. So this is a game changer... and I
didn't even have to set up a Mac running ProTools. Now all we need
studios for is to cut the tracks, say thank you and leave with the
wave files.
Ardour has two main windows... an editor window showing wave
views of the tracks, and a mixer window where it is more
convenient to operate the controls and add effects pre and post
fader. Tons of processing and effects are included and almost any
parameter can be automated using an automation timeline. For
automating cuts and stuff I found it more convenient to add pre
amplifiers to the tracks and automate the gains (can draw points
directly on the track timeline, sloping between the notes to avoid
clicks), leaving the faders free for mixing. Would be better to
run on a dual monitor setup to avoid having to go back and forth
but that really wasn't a problem - I have a Gnome-2-like setup so
just click click.
There are bugs but all had easy workarounds. When saving effect
settings to a patchname, the first parameter isn't restored
(workaround - set 1st parm manually, current settings are saved
just fine when saving the session, only affects saving to a
patchname). When using "touch" fader automation sometimes it
caused a huge spike (workaround - do it again but automating amp
gains works better anyway). When the beginning mark is moved down
the timeline, the program would crash when exporting (workaround -
always click the seek to beginning when reloading a session or
before exporting [if it still crashes playing a little past the
start usually fixes]). I'm used to the quirks now, I just
automatically compensate.
This 2.8 version is pretty old so thought I'd try the new 3.5
version. Ardour is "donate-ware", to download the new binaries
requires a payment - but you get to decide how much, plenty fair.
Ardour can run from its own directory without actually installing
so no need to uninstall or replace the repository version, can
just unpack it and run it. In theory... didn't work for me, the
log gave jack errors about not being able to use realtime
scheduling. After some googling the fix was to add the file
/etc/security/limits.d/99-realtime.conf containing...
@realtime - rtprio 99
@realtime - memlock unlimited
...then run the command sudo groupadd realtime followed by sudo
usermod -a -G realtime myusername. Also added myself to the audio
group and some other groups that looked like they might have an
effect before finding the real fix. But there are still some usage
issues which I won't get much into here - some are just
learning how the new version works, others are just that I usually
like older simpler software better unless the new version does
something I need that the old version doesn't do. For now version
2.8 works for me, and like I said, it's a game-changer.
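Gathering the realtime fix into one place, something like this (a sketch; the root-only steps are shown as comments and "myusername" is a placeholder). The heredoc content matches the two limits lines above...

```shell
#!/bin/bash
# stage the limits file content first (doesn't need root)
cat > /tmp/99-realtime.conf <<'EOF'
@realtime - rtprio 99
@realtime - memlock unlimited
EOF
# then, as root:
#   groupadd realtime
#   cp /tmp/99-realtime.conf /etc/security/limits.d/99-realtime.conf
#   usermod -a -G realtime myusername   # replace with your login name
# log out and back in so the new group membership takes effect
```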
3/20/14 - Ardour 3.5.357...
Sure looks good! Mostly I've been using the Ardour 2.8.12 that's
in the Ubuntu 12.04 repository, but trying to adapt to the new
version 3. The '357 update seems a lot more stable, just did a
quick project mix with it and didn't run into any real issues - no
crashers. The main issue I had this time around was with the
position box at the bottom.. if grabbed wrong it resizes the track
strips and not in a useful way, in my opinion all it should do is
move the view. Scrolls quickly when dragging the position marker
so learned not to touch that thing (or just disable it). Coming
along... it still seems like a friggin' miracle to be able to do
my own mixing without a "producer" telling us how our music is
supposed to sound.. now we can cut our stuff get wav files see ya
bye. After getting a mix I run the Ardour export through Audacity
to do the mastering - usually shave off the high peaks and
normalize but sometimes overall EQ too [however this isn't good
enough for final production, for that found something called
JAMin, page down] - then export to a "lame" (legal) MP3 at the max
rate of 320K. Even if going to a CD I still prefer the MP3
conversion as it tends to get rid of stuff I don't want to hear
anyway (in my opinion, I'm sure other audio folk will disagree),
and let the Brasero burner app do the bitrate conversion and
leveling. Sure has come a long way in the 30 or so years I've been
messing around with studio audio.. the big expensive tape
machines, consoles and outboard gear have mostly been replaced
with digital converters and PCs. At least the mics and actual
musical gear are about the same, still playing through old tube
amps. It took me a
long time (years) to get on board with the new digital ways - my
initial experiences with software-based studios were not good..
out-of-time tracks, crashed computers, and when it worked the
mixes tended to sound too processed... but this time around
recording with ProTools and mixing with Ardour I've had no issues
whatsoever with latency, perfect feel, can make it as raw or
processed as I want. Now I'm convinced the new ways really can be
better - it's all about how the new tech is used.
Although... have to be careful about getting too whiz-bang with
the stuff or the resulting software bugs can shift it back the
other way.
In other software dealings... been tracking Ubuntu 14.04
development and not sure I like what I'm seeing [previous notes
removed.. see below].
4/15/14 - For some odd reason the new Ardour 3 started crashing
bad, more often than not if I select a track strip when the song
is playing it makes a loud noise then closes. Fills up my
xsession.errors file with megabytes of GTK warnings and Jack
errors like Process error and Broken pipe. Had to go back to
Ardour 3.5.74 to fix the crash issue and finish the mixes (latest
is 3.5.357, 3.5.143 still has the issue). Something changed,
didn't crash before, and no mention of it in the forums or bug
tracker about it so assuming it's something to do with my system.
Most likely with Jack.. Ubuntu 12.04 uses jack 1.9.8 which is
known to be problematic, unfortunately several other things depend
on it. But it's not all Jack.. something the software is doing is
bigtime stressing the system (maybe it can't keep up with all the
error logging going on). Ardour 2 works so probably will stick
with the old repo version for new mixes, at least until this gets
sorted.
Funny.. the nice thing about open source software (at least when
manually installed) is I can install whatever version I want to
find what works. The guy with the ProTools studio we record at
recently upgraded and is having all sorts of crash issues; the
upgrade revoked the license for the older version, so he's stuck
with it unless he buys a new license for the version he was using
(if that's even possible). Now he has to save constantly to avoid
losing work, at least when Ardour crashes it can usually recover
unsaved changes when the session is reopened - besides, with open
source I'm able to freely reinstall the older version if there's a
problem. I wish the regular software repositories had this
feature! But that probably wouldn't be practical.
Ubuntu 14.04 has improved [sort of, the lightdm login manager
still can't scroll, won't take KB entry, and can handle only so
many entries before some push off the screen] but the new Nautilus
3.10 file manager sucks for what I do. It no longer has a context
option to open folders in a new window in the usual place, have to
either add a script to do that or right-click the location buttons
(I guess they forgot to remove it from there - but that means to
open a folder in a new window I have to go there first then open
the parent in a new window.. that's just goofy), can't run scripts
unless something is actually selected (makes no sense for scripts
that simply need to run in a particular directory). Fortunately
there's relief - Nemo is available in the default repository and
it has none of these issues. Plus the handy extra pane feature.
When installed it replaces nautilus as the default file manager so
nothing else needs doing. The new Nautilus was my main blocker, so
with that out of the way I can at least think about upgrading at
some point.. after the dust settles. I would like to figure out
how to get gnome-panel to space app icons closer together but
that's not a huge deal.
7/12/14 - Ubuntu 14.04 in VirtualBox with metacity, lxpanel and
nemo...
Ubuntu 14.04 with metacity, gnome-panel and nemo...
...still working on the theming (it ain't easy! no docs!!!). The
gnome-panel.css is stock ambiance except for commenting out the
border-image tags for the buttons. The gtk-widgets.css is hacked
from my modified ambiance theme so the scroll bars will be
visible. Not crazy about the button theme, cuts off the text
descenders but don't know what to do about that, replacing with the
stock widgets file didn't change that and broke other stuff,
besides making the scroll bars nearly invisible again. Making the
panel taller helps with the text but makes the icon spacing issue
worse. One thing I noticed though... an earlier screenshot of
12.04 also showed the icon spacing issue, which went away. Not
sure if from an update or if I accidentally hacked something. I
tried running 14.04 with the stock ambiance.. no go. Same issues,
same broken look with radio buttons...
I think I'll stick with my modified version of Ambiance. [...]
7/14/14 - Computer users can be so stubborn.. especially those of
us who rely on computers for their job. To me, the OS is that
thing standing between me and what I need to do. It takes a
considerable amount of effort to figure out how to efficiently do
the things I need to do, find the right combination of user
interface software, script stuff so I have the right-clicks I
need, get my panel set up with the shortcuts and information I
need... once it's all working then I'm working and doing my thing
and I don't want my GUI to functionally change. Preferably, ever.
Sure I want newer better apps, improved appearance, better
stability, but functionally I don't want anything to change unless
it truly helps me do my job better.. and if I do get on board with
something new, I want to know it's going to stay about the same
for many years, or it's not worth my time to learn how to make it
work. New shells, new ways of doing things, are fine for new
users, or users with certain needs - app-centric is great for
folks who tend to run one app at a time, but that ain't me. If I
want an app to be full-screen, there's a button for that, but
usually I have multiple instances of multiple apps open at once
and right now it takes just one click to pick the window I want to
see. Any OS that complicates that simple thing I do hundreds of
times a day is a non-starter no matter how good it looks - the OS
isn't the app! I just want the OS to do what I tell it to, look
good enough to not distract me, and otherwise stay out of my way.
And I darn sure don't want features I've come to rely on to be
removed to "simplify" things (it has the opposite effect).
Fortunately at least some developers get that, recently there's
been a surge of new desktop environments and components that work
"the old way". Plus this is Linux - it will conform or I will hack
it into submission. OK.. a certain amount of GUI hacking is fun...
but once I got it the way I want, I want it to stay that way.
Ubuntu LTS releases are that way - after an upgrade it might take
some effort to get it the way I want, but I don't have to worry
about some essential component or feature going away. The downside
is after awhile it gets harder to run newer apps - PPA's help but
only to the point the pinned library versions let it happen.
I'm tempted to clone my existing HD and try a 14.04 upgrade...
it'll take a bit of cosmetic work but at least that'll let me keep
gnome-panel for another five years and I won't need to
fundamentally change my work flow. Or by the time 16.04 comes
around perhaps the Mate version will be official.
7/21/14 - Mastering with Ardour (2), JAMin and TimeMachine...
Mastering is the process of preparing a song or other audio
material for consumption. Uncompressed mixes can sound great, but
even though the peaks are near clipping, it just doesn't sound
"loud" compared to commercial productions. Unless doing symphony
stuff, have to pump up the volume, generally by applying
multi-band compression and fast limiting but doing it in a way
that doesn't cause undesirable effects. Frequency equalization
(EQ) may need to be applied, but as I have the source tracks I try
to mix the stuff so that little or no overall EQ is needed. For
test mixes I was simply using Audacity to limit the peaks then
normalizing the volume, gaining about 3db, but that's not good
enough... need more like 7-10db average volume level increase to
compete with popular music. I'm not sure it can be done with
Audacity (I don't see a multiband compressor plugin), even if it
could be the process would be tedious. Besides on my system
Audacity is very crash prone (sometimes even playing a track
causes it to lock up) - usually I just use it to import, trim and
other simple tasks that can be done visually, then export to the
needed format. Plus I need something that works in real time so I
can hear the results - often mix changes are needed as the
increased volume tends to bring out flaws that would otherwise go
unnoticed.
Enter JAMin
- this app (available in the 12.04 repository) connects to a JACK
source and outputs to the system audio and/or another JACK app,
and processes whatever runs through it. To use with Ardour
disconnect the master output from the system audio then in JAMin
select Ardour for the input port (sometimes have to also make sure
the left and right inputs come from Ardour's master outputs and
not from a track). This kind of interconnection flexibility is
where JACK shines.. any output from a JACK-enabled app can connect
to any input of another JACK app. Now Ardour plays through JAMin,
the defaults are pretty much spot-on to my ears, just increase the
input level until the desired apparent volume is achieved. I also
drop the limit level to about -0.4db to avoid touching the peak
point. Now just have to capture the audio to an audio file.. I had
no luck using the usual suspects (Audacity, sound recorder etc)
but searched for JACK in the synaptic package manager and found
something called "Time Machine" - its main function is to record
any JACK audio fed to it with a 10 second pre-buffer to avoid
missing stuff but it works great as a general-purpose JACK
recorder. When the button is pressed it dumps the audio into a wav
file (by default) in the home directory named tm plus date and
time info; press the button again to stop recording. Once running just
select Time Machine as an output in the JAMin ports setup. To
avoid excess silence at the beginning I used Alacarte (main menu)
to add -t 1 to the command line to specify a 1 second pre-buffer.
The resulting wav file can be imported into Audacity, trimmed, and
exported to the needed format - for me usually 320K lame mp3 but
for a final can be flac or wav - but at that bit rate I can't hear
any difference. This tool flow also solves another major issue I
was having - Ardour's tendency to lock up when exporting -
everything works in real time so it doesn't stress the system. I
do everything in Ardour 2 now, version 3 just doesn't work on my
system but other than export, version 2 from the repository works
great.
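The rewiring described above can also be done from the command line with JACK's own tools. This is only a sketch - the exact port names below (ardour:master/out 1, jamin:in_L and so on) are assumptions that vary by app version and session, so check the output of jack_lsp first:

```shell
# list all current JACK ports to find the real names
# (the port names below are assumptions - check against jack_lsp output)
jack_lsp

# disconnect Ardour's master outs from the system audio...
jack_disconnect "ardour:master/out 1" system:playback_1
jack_disconnect "ardour:master/out 2" system:playback_2

# ...then route them through JAMin instead
jack_connect "ardour:master/out 1" jamin:in_L
jack_connect "ardour:master/out 2" jamin:in_R
jack_connect jamin:out_L system:playback_1
jack_connect jamin:out_R system:playback_2
```

Same effect as dragging connections in the JACK patchbay, just scriptable.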
One thing about JACK... when JACK apps are running it keeps
non-JACK music players from working (I'd love to find a solution
for that!), so to check the exported mp3 file without closing the
apps I installed a command-line app called mpg123 and made a
simple mpg123jack script...
#!/bin/bash
mpg123 -o jack "$1"
...used my AddToApplications script to make a ".desktop" file for
it, specifying run in a terminal, then added it to the
associations for mp3 files. It tends to cut off the very end of
the sound if there's not enough trailing silence - but for now it works,
until I find a JACK-aware music player that works without fiddling
stuff. Installing the JACK plugin for VLC wasn't enough to make it
"just" work, as in play the darn file when I right-click and play
it.
Once again, I'm amazed by what I can now do on my own using only
free software. This certainly isn't the only way to do it, some
folk don't like
what they get using JAMin and prefer using DSP plugins to do it
all in Ardour. Thing is though, with audio production there are
about as many opinions as those doing it. That's what I'm trying
to avoid by doing it myself! In my experience, I get far better
results mixing my own stuff using tools I select, than letting
someone else unfamiliar with my music do it for me. No matter how
good their stuff is. Here's a nice
tutorial about using JAMin - but be careful, I suspect a lot
of issues people have are from turning too many knobs - if you can
hear the compression, it's too much.
9/1/14
A little mystery... one of the things I occasionally run in
dosemu is an old The Need For Speed demo.. call it stress relief.
But recently I went to run it and all I got was:
DOS/16M error: [17] system software does not follow VCPI or DPMI specifications
Used to work.. and it does work if I boot using kernel 3.2.0-63
or earlier but doesn't work with later kernels. After some digging
found
this with the solution. Added the file
"/etc/sysctl.d/ldt16.conf" containing the line "abi.ldt16 = 1",
TNFS and other dpmi apps under dosemu work fine now. The issue was
(apparently) a potential information leak, but pretty much every
non-sandboxed task running on my system under my user permissions
already has at least read-only access to all my files... if there
were a malicious task running on my system that could take
advantage of this, well it would be pretty much over anyway, so I'm
not concerned about something being able to peek at bits of my memory.
Of course for multi-user systems where users might do anything,
something like that is a much bigger deal and bugs like that
should be fixed. I'm the only one using my system and just want to
run dos games, and I'm thankful an "off" switch was provided (by
Linus) until there's a more proper solution. Used to have to set
/proc/sys/vm/mmap_min_addr to 0 to make dosemu work right -
eventually that got fixed (in dosemu itself - it now emulates vm86
mode).
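The fix described above boils down to two steps; the file name and key come from the text, while the sysctl reload step (to apply it without rebooting) is my assumption about how it would typically be done:

```shell
# create the sysctl override described above (needs root)
echo 'abi.ldt16 = 1' | sudo tee /etc/sysctl.d/ldt16.conf

# apply all sysctl.d files now instead of waiting for a reboot
# (assumption: reload step - a reboot works too)
sudo sysctl --system
```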
16-bit software just won't die... and why should it? To this day
nobody has come up with anything modern that lets me make
computational programs as fast and easy as just running QBASIC
under freedos/dosemu.. which wasn't affected by this latest kernel
bug-fix so it took a while to notice - this time only extended memory
programs (games) were affected. Occasionally developers assume
nobody runs old code anymore and make some change that breaks old
userspace code; old-fashioned guys like me notice, a workaround gets
issued... sounds like the system is working just fine! The day may
come that nobody cares anymore, but until then Linux supports old
Dos and Windows programs better than Microsoft does.
The Crash
A couple of weeks ago I was working on a script that searched
files for certain strings, and noticed several files showed I/O
errors.. ran GSmartControl and although it gave my main 500
gig WD drive a pass on basic health, under attributes it showed
over 50 unrecoverable bad blocks. Ouch. For the most part the
system seemed to work fine, the damage seemed to be restricted to
old archived files I haven't accessed in a long time. Modern hard
drives are supposed to self-correct - when a read error occurs,
the next time the block is read successfully (possibly on retry
from the same request?) the data is mapped to a fresh block and
the bad block is added to the bad blocks list (some sources say
bad blocks are remapped only if written to).. so for the most part
users never notice blocks going bad. Unless a block fades away
from lack of use.. read it or lose it? Lots of questions and lack
of information, but it was clear the disk was going bad. So got a
Toshiba 2 terabyte drive for about $100 (like wow..), downloaded
Ubuntu 14.04 and put it on a thumbdrive, swapped the drives around
so /dev/sda would be the new drive and /dev/sdc would be the old
failing drive (/dev/sdb is my data drive). On my first attempt to
save my old system I tried to use dd to copy the entire drive to
the new drive then use fdisk to fix the partition... which might
have worked except for 3 things: I forgot to include sync in the
conv=noerror option so it horribly corrupted the file system; the
new drive uses 4096 byte sectors and of course a partition
boundary split a sector; and (as I figured out later) the UUID
stayed the same and the system gets horribly confused when 2 disks
have the same UUID.
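For reference, here's roughly what the clone command should have looked like; conv=noerror,sync is the key part - noerror keeps going past read errors and sync pads the failed block with zeros so everything after it stays aligned. The device names are from the text and the bs value is my assumption; the small demo on a regular file shows the padding behavior:

```shell
# whole-disk clone that skips bad blocks instead of corrupting the copy
# (device names as in the text - verify yours before running!)
# dd if=/dev/sdc of=/dev/sda bs=4096 conv=noerror,sync

# demo of the sync padding on an ordinary file:
printf 'hello' > /tmp/dd_demo_src     # a 5-byte input file
dd if=/tmp/dd_demo_src of=/tmp/dd_demo_dst bs=512 conv=noerror,sync 2>/dev/null
stat -c %s /tmp/dd_demo_dst           # output is padded to one full 512-byte block
```

Note this still wouldn't have solved the duplicate-UUID and 4096-byte-sector problems, which is why the cp -ax approach below worked out better.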
So... installed Ubuntu 14.04 using the "do something else" option
and partitioned the new drive with a ~500 gig partition for the
new 14.04 system, a ~500 gig partition for my old 12.04 system
(assuming I could save it..), about ~10 gigs of swap, and the rest
/home for the new 14.04 install. My old system showed up as
/dev/sda2 so when partitioning I arranged it so the old system
would remain on sda2 - not that that mattered. By this time I'm almost
starting to panic a bit, the old system got to where it would no
longer boot reliably (likely due to the UUID mixup I hadn't
figured out yet)... I had backups of all important data, but the
prospect of having to set up the operating system and reinstall
all the apps I use was not sitting well with me (oh it's all shiny
and fresh and boots so fast now... and does almost nothing I need
to do). So.. mounted my old failing disk partition to /mnt/src,
the new sda2 partition I made to /mnt/dest, and did cp -axv
/mnt/src/ /mnt/dest to copy everything it could to the new drive.
Should have left out the v option so I would see only damaged
files.. but at the time seeing all my stuff being restored was
comforting. Looked like all the system stuff was ok. Of course it
wouldn't boot.. UUID stuff again. So (after some googling to
figure out what I was doing) back in 14.04 I mounted the partition
with a copy of my system, did mount --bind commands to add /dev
/proc and /sys, did chroot to the new system, nano /etc/fstab and
fixed the UUID's for / and swap, did update-initramfs -u -k all,
did update-grub (from inside the chroot) - darn thing still
wouldn't boot.. ended up at the busybox prompt. Taking a hint from
the displayed text did cat /proc/cmdline - aha still was trying to
boot with the old disk UUID (something didn't get fully updated),
rebooted and manually fixed the incorrect UUID and then it booted
right up (yay!!!). Poked around, the damaged user files were
obvious in the file manager (marked with an X) and it looked like
all the system files were fine, so did sudo update-grub followed
by grub-install and that fixed the grub menu (replacing it with
the one from 12.04 with 12.04 being default), boots fine now.
Removed the damaged drive and replaced the damaged files from
backups.
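The rescue sequence above, gathered into one place. This is a hedged sketch only - the partition and mount point names are the ones used in the text, it must run as root, and it's wrapped in a function so nothing fires accidentally:

```shell
# sketch of the chroot boot repair described above (run as root;
# /dev/sda2 and /mnt/dest are the names used in the text)
repair_boot() {
  mount /dev/sda2 /mnt/dest
  mount --bind /dev /mnt/dest/dev    # expose devices inside the chroot
  mount --bind /proc /mnt/dest/proc
  mount --bind /sys /mnt/dest/sys
  chroot /mnt/dest /bin/bash -c '
    nano /etc/fstab                  # fix the UUIDs for / and swap by hand
    update-initramfs -u -k all       # rebuild initramfs with the new UUIDs
    update-grub                      # regenerate the grub menu
    grub-install /dev/sda            # reinstall grub to the new disk
  '
}
```

As noted above, the kernel command line may still carry the old UUID on the next boot and need a manual fix at the grub prompt.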
That was close. While losing my 12.04 system wouldn't have been a
total disaster, it would have slowed down my work considerably
until I got everything reinstalled and reconfigured in the new
system. Backing up just "important" stuff is not sufficient, I
want a copy of every single thing! All (at the moment) 1.3
Terabytes of it. So ordered two more of those 2T Toshiba drives,
hooked up one as /dev/sdc, partitioned it as one big volume, made
two directories on it (ubuntu1204 and datadisk) and used cp -ax to
copy everything from both my main system and data disk to the
appropriate directories. To keep the copy updated I made a script
containing rsync -axv --delete --exclude=.gvfs /
/mnt/bk/ubuntu1204 for the main system (it mounts /dev/sdc1 to
/mnt/bk first, the exclude is because of a bug that makes the
~/.gvfs directory unreadable by root and if any error occurs rsync
won't process deletes), and rsync -avx --delete /mnt/datadisk/
/mnt/bk/datadisk to back up my data disk (which stays mounted to
/mnt/datadisk via fstab). The script has checks to make sure the
ubuntu1204 and datadisk directories exist on the backup first in
case something changes the disk order and /dev/sdc1 ends up being
some other disk. Every month or so I'll swap the backup disks.
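The backup script described above can be sketched as a function (backup_all is a made-up name; the rsync commands and the directory checks come from the text, the exact guard wording is mine):

```shell
# sketch of the backup script described above; the directory checks
# guard against /dev/sdc1 coming up as some other disk
backup_all() {
  local bk="$1"   # mount point of the backup disk, e.g. /mnt/bk
  # refuse to run unless the expected directories already exist...
  if [ ! -d "$bk/ubuntu1204" ] || [ ! -d "$bk/datadisk" ]; then
    echo "backup dirs not found on $bk - wrong disk?" >&2
    return 1
  fi
  # exclude ~/.gvfs - a bug makes it unreadable by root, and rsync
  # won't process deletes if any error occurs
  rsync -axv --delete --exclude=.gvfs / "$bk/ubuntu1204"
  rsync -avx --delete /mnt/datadisk/ "$bk/datadisk"
}
```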
And maybe I'll start fixing up 14.04.. so far all I've done to it
is install the "flashback" session and the nvidia driver which
triggered having to do other fixes (framebuffer stuff to avoid
cosmetic boot glitches, same as for 12.04). But I'm in no hurry...
for now my highly-tuned 12.04 system works for me.
10/14/14 - Various Random Stuff...
Still haven't done anything more to my 14.04 install.. other than
stealing some of its huge home directory for "off line" file
storage (VM backups etc) to give a bit of breathing room on my
12.04 system. 14.04 just isn't doing it for me.. doesn't really
add anything new and I don't like the regressions in lightdm
(limits how many sessions), nautilus (sucks now, have to replace
with nemo), gnome-panel (too much space between panel icons), etc.
It's there if I need it in case I need to take 12.04 off-line to
work on it, or run into something that really needs 14.04 (maybe
for Lightworks? a professional-grade video editor would be nice..
but if I had that it would probably increase my workload... less
work if I just stick to OpenShot :-).
As of kernel update 3.2.0-70 a few days ago, dpmi programs under
dosemu no longer work.. again. The new 3.2.63 upstream kernel
removed the ldt16 workaround and was supposed to include a proper
fix (espfix64) but looks like it doesn't work for whatever reason.
I feel like a dinosaur... but I'm not
alone. At least QBasic still works.. that's an almost
mission-critical app for me (still the fastest/easiest way for me
to make calculation programs). [was a configuration issue, kernel
update 3.2.0-72 fixed the problem.]
The last few months have been tough for open-source security..
first the "heartbleed" bug, more recently the "shellshock" bash
bug. Heartbleed was a bugger as it forced almost everyone to have
to update passwords. Shellshock was just stupid... basically a
combination of setting environment variables to untrusted input
from remote users (what??? really???), bash allowing evars to
contain functions (a dubious feature I didn't know about), then
system software calling bash to do stuff that should be done with
a true sh equivalent, not a shell meant only for local users. But
for the most part it should all be fixed now, there were some
breaches but compared to all the other breaches going on these
days, not too bad. Just another reason why web-facing services
need to be kept up-to-date. And the big lesson - watch what goes
into environment variables! In my opinion functions-in-evars
should be totally disabled but that would break a few things that
use that "feature" so the patches work around it by only allowing
code in specially-named variables.
Of course some have used these bugs to attack open-source
software, but I don't buy it - the system worked just as it should
- bugs found, bugs fixed, move on. Does kind of dispel the notion
that with open-source bugs will be found sooner because of all the
eyes on it... but I never really believed that anyway. All
significant software has bugs, and systems need to be set up to
assume that as much as possible.
Speaking of environment variables... for a long time I've had
LESS_TERMCAP variables that contain ANSI escape sequences that
colorize the output of the less command.. but they also mess up
the output of the env command, so I had an alias for env that ran
it through grep to remove the offending variables. I didn't know
it until all this shellshock stuff came around, but that broke
using env with parameters at the command line, causing the
env-based shellshock tests to fail. So inspired by one of the
bug-tests, I replaced the env alias with this function in my
.bashrc file...
# normal env command will colorize output, redefine the env command...
env() { /usr/bin/env "$@" | sed "s/\x1B/[esc]/g"; }
...this runs the actual env command with any supplied parameters,
then passes the output through a sed command that replaces any
ascii-27 characters with [esc] so ANSI escape sequences won't mess
up the output. Scary bug... learning!
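The sed substitution can be seen in isolation by feeding it a fabricated variable value containing a real escape character (the variable name here is made up for the demo):

```shell
# simulate a colorized variable value as env would print it...
line=$(printf 'LESS_TERMCAP_md=\x1b[1;33m')

# ...and run it through the same sed replacement the env() function uses
# (GNU sed understands the \x1B hex escape in the pattern)
echo "$line" | sed "s/\x1B/[esc]/g"   # prints LESS_TERMCAP_md=[esc][1;33m
```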
[comments about systemd removed... it's a complex and evolving
subject...]
10/30/14
Bug #1382251 seems to be similar to the dosemu bug - "Kernel
update
breaks Picasa". Picasa (actually no longer supported) is run
using an integrated version of wine, and probably contains 16-bit
code. Indeed, no 16-bit app (dos or Windows) works after the
3.2.0-70 kernel update.. tried the old reversi.exe from Windows
3.1 and it comes up with the same error messages as with the
Picasa bug. Worse, after an nvidia driver update, I can no longer
boot into the previous kernel with full-resolution graphics (was
going to verify that [the old Windows] reversi still worked -
pretty sure it used to). Tried apt-get install --reinstall
nvidia-current in recovery mode, no effect - probably an easy
solution but previously booting into the previous kernel worked
fine, now it doesn't. Apparently the new driver removed itself
from previous kernel versions and only installed itself on the
latest version, effectively (unless I can figure it out)
preventing me from running previous kernels unless I want to
suffer 800x600 resolution. I'm not amused.. will figure out how to
fix it later (running previous kernel versions is an important
safety net in case more serious bugs arise), but sure do have
better things to do than figuring out how to configure graphics
and kernel modules from a command line (while avoiding booting
into my environment graphically lest it mess up my icon
arrangement - things are where I want them).
But back to bug #1382251 - apparently the issue is kernel compile
option CONFIG_X86_16BIT is not set - it says that in a comment in
the /boot/config-3.2.0-70-generic file. Notes regarding the
upstream 3.2.63 kernel indicate the kernel must be configured to
allow running 16-bit extended memory code at all (to make the
kernel more compact for embedded apps).
[update 11/25/14 - Ubuntu kernel 3.2.0-72 fixes the configuration
issue - dos dpmi programs now work under dosemu. Cool! I don't use
dpmi stuff much but it's nice to run a few laps around the NFS
track every now and then...]
11/19/14
The heck with bugs.. let's have fun. Can't figure something out? Maybe see what this program says about it...
#!/usr/local/bin/blassic
10 rem yesno.bas - 11/19/14 WTN
100 dim a$(6) 'answer list... modify as desired...
101 a$(1)="Yes."
102 a$(2)="Probably."
103 a$(3)="Maybe."
104 a$(4)="Probably not."
105 a$(5)="No."
106 a$(6)="Don't know."
160 n=6 'must match number of answers
170 randomize
180 cls
190 print "This program answers yes/no questions."
220 print "Press just enter to exit."
280 print
290 print "Enter your question..."
295 u=int(rnd(1)*n)
300 f=1:print "_";chr$(8);
310 if u<n then u=u+1 else u=1
315 shell "sleep 0.02" 'linux specific! for other OS remove
316 'or replace with some other kind of millisecond delay
320 k$=inkey$:if k$="" then goto 310
330 if (k$=chr$(13) or k$=chr$(10)) and f=1 then goto 360
332 if k$<>chr$(8) and k$<>chr$(127) then goto 340
334 print chr$(8);"_ ";chr$(8);chr$(8);:goto 310
340 if k$=chr$(13) or k$=chr$(10) then goto 350
344 f=0:print k$;"_";chr$(8);:goto 310
350 print " ":print a$(u):goto 290
360 system
This is written in Blassic, a
handy BASIC interpreter that can run in #!-style script mode. I
use version 0.10.2 (local copy here)
- to compile make sure g++ and libncurses5-dev are installed (may
need other dependencies depending on what configure says - ignore
the bit about not finding x, that's only needed for graphics),
extract it then in the extracted directory do ./configure followed
by make - if successful either copy the blassic binary to
/usr/local/bin or do sudo make install.
Sample run...
This program answers yes/no questions.
Press just enter to exit.
Enter your question...
Should I just accept systemd and move on?
Don't know.
Enter your question...
What about MATE 14.04?
Yes.
Enter your question...
_
The underscore is a simulated cursor, only backspace editing is
supported. The actual contents of the question is not considered,
rather the program determines the response using a combination of
a pseudo-random function which is further randomized by a
free-running cycling counter so that the output is
non-deterministic. Some programs like this use only the
pseudo-random function, but that returns a (very long) fixed
sequence, which would pre-determine the responses based on the
exact moment the program is launched. That just doesn't seem
right. With the counter, the response is ultimately determined by
the moment the enter key is pressed, thus you (along with the
quantum continuum or whatever you're into) determine what it
says.
Blassic is cool but it's a hassle to compile and isn't supported much (if at all) anymore, so I rewrote the program in bash...
#!/bin/bash
# yesno.sh - 11/19/14 WTN
if [ ! -t 0 ]; then exit; fi    # only run in a terminal
stty -echo -icanon time 0 min 0 # enable non-blocking input
sleep 0.2                       # wait to take effect
# answer list... modify as desired...
a[1]="No."
a[2]="Probably not."
a[3]="Maybe."
a[4]="Don't know."
a[5]="Probably."
a[6]="Yes."
answers=6        # must match number of answers
empty=1          # flag to indicate empty line
quit=0           # flag to indicate quit condition (enter on empty line)
counter=$RANDOM  # counter increments 1-6 to determine answer
# RANDOM returns 0-32767 but will be reduced to 1-6 by incrementer
echo
echo "This script answers yes/no questions."
echo "Press just enter to exit."
echo
echo "Enter your question..."
while [ $quit -eq 0 ];do # loop until quit gets set
 sleep 0.02 # delay a bit so it won't consume all CPU time
 # increment/cycle counter...
 counter=$((counter % answers)) # reduce counter to 0 to answers-1
 counter=$((counter + 1))       # add 1 to counter so always 1-answers
 # get raw keystroke... term must be in non-blocking mode
 key=$(head -c 1;echo x)
 key=${key%?} # strip the x from key string
 # without the echo x and subsequent removal it doesn't pick up enter
 if [ "$key" != "" ];then # if a key was pressed
  if [ "$key" == $'\x0a' ];then # if enter was pressed
   echo # echo the newline
   if [ $empty -eq 1 ]; then # if empty line then quit
    quit=1
   else
    echo "${a[$counter]}" # otherwise print answer
    echo "Enter another question..." # prompt again
    empty=1 # set empty flag
    counter=$RANDOM # randomize counter again
   fi
  else
   # if using || (or) then has to be [[ .. ]] kind of test
   if [[ "$key" == $'\x7f' || "$key" == $'\x08' ]]; then
    echo -ne '\x08 \x08' # backspace (responds to either kind)
   else
    empty=0 # reset empty flag to indicate something was typed
    echo -n "$key" # print the key that was just pressed
   fi
  fi
 fi
done
stty sane # restore terminal
This is a lot easier to run under most versions of Linux...
simply copy it to a text file, make it executable and run it in a
terminal. Tested in Ubuntu 12.04 and 14.04 and Debian 6 and 7. The
overall operation is similar to the Blassic version but the logic
is implemented differently. The bash version is structured (there
is no goto in the bash language), but that complicates the logic -
in BASIC one simply jumps where needed but in goto-less languages
one has to control the flow using the provided structures. Not
that there's anything wrong with that, it just makes trivial code
less trivial due to the extra code needed to get from one part of
the code to another. The flip side is if a large program were
written using unstructured goto flow it would likely be
incomprehensible.
Getting the equivalent of BASIC's INKEY$ function in bash was
tricky - the terminal blocks until a key is pressed so in order to
have a free-running counter I had to put the terminal in
non-blocking mode. Got the idea from
here, that example used the dd command but I rewrote it to
use a simple head command instead. The output of a subshell
removes the trailing linefeed, so to pick up the enter key I have
to append a junk character then remove it.
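The sentinel trick can be demonstrated non-interactively by feeding bytes through a pipe (read_one_key is a made-up name for the demo; the script above does the same thing inline):

```shell
# read exactly one raw byte from stdin, preserving a newline byte
read_one_key() {
  local key
  key=$(head -c 1; echo x)  # append junk 'x' so a trailing newline survives $( )
  key=${key%?}              # strip the junk character back off
  printf '%s' "$key"
}

printf 'q' | read_one_key            # an ordinary key comes through as-is
printf '\n' | read_one_key | wc -c   # the newline byte is preserved (1 byte)
```

Without the sentinel, command substitution would silently eat the newline and the enter key would never be detected.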
11/23/14
Here's a python (2.6 or 2.7) version of the yesno program...
#!/usr/bin/python
# yesno.py - 11/22/14 WTN
import random,time
answers = (
    "No.",
    "Probably not.",
    "Maybe.",
    "Probably.",
    "Yes.",
    "Don't know."
)
random.seed()
print
print "This program answers yes/no questions."
print "Press just enter to exit."
print
while (True):
    question = raw_input("Enter your question...\n")
    if len(question) > 0:
        modtime = (time.time()%10000.0)*100.0
        selection = int(modtime*random.random())%len(answers)
        response = answers[selection]
        print response
    else:
        break
This version is quite compact because it doesn't attempt to run a
counter with non-blocking keystroke input (which in python doesn't
seem to be exactly trivial), rather it adds entropy by checking
the time when the enter key is pressed. More than one way to skin
Schrödinger's cat. The time.time() function returns a float
representing how many seconds since epoch (whenever that was),
this is mod'd by 10000 to reduce the range and avoid potentially
losing precision then multiplied by 100 so that the integer
portion changes every 10ms. The selection is made by taking that
value, multiplying by the float returned by random.random() then
mod'ing by the number of answers. The list of answers can be
trivially modified, thanks to the len function nothing else needs
changing for any number of responses.
Python is pretty cool but there are issues. Biggest issue is the
language is somewhat of a moving target.. python 3 is incompatible
with python 2, and although python 2.7 is supposed to be supported
for a long time, how long it will remain easily available is
uncertain. Until "at least 2020" isn't really good enough. Need
something for creating scripted programs that's at least as stable
as bash but easier to write and read. Preferably that doesn't rely
on an obscure interpreter that may not be buildable or even
available in the future.
C has feature-creep incompatibility issues
too, but mostly in extra libraries and fancy stuff - the core
language is stable. At least the bits of it I'd be using for the
simple kinds of programs I typically write. One obvious solution
is a simple wrapper that permits C source to be "executed"
directly...
#!/bin/bash
# cwrapper - 11/23/14 WTN
# this script lets simple C programs be written like scripts
# by specifying this script in an initial #! line
if [ -e "$1" ];then # if script exists
 ctmp=$HOME/.cwrapper_tmp # temp dir for building
 cname=`basename "$1"` # derive base name of script
 mkdir -p "$ctmp" # make sure it exists
 tail -n +2 < "$1" > "$ctmp/$cname.c" # copy C source to temp file
 shift # shift all parms down 1
 # compile the source with gcc and if no errors run it with parms
 gcc -o "$ctmp/$cname.bin" "$ctmp/$cname.c" && "$ctmp/$cname.bin" "$@"
 if [ -e "$ctmp/$cname.c" ];then rm "$ctmp/$cname.c";fi # remove temps
 if [ -e "$ctmp/$cname.bin" ];then rm "$ctmp/$cname.bin";fi
fi
Ha.. crude hack but I like it, copy to /usr/local/bin then can
write scripts like...
#!/usr/local/bin/cwrapper
#include<stdio.h>
main()
{
 printf("Hello World\n");
}
Another way to do it is to embed similar code with the C code to
make it a stand-alone script program...
#!/bin/bash
# cwrapper_embedded - 11/23/14 WTN (this wrapper is public domain)
# this code lets simple C programs be written like scripts
# by copying this code in front of the C code.
cstart=16 # line number where the C code starts
ctmp=$HOME/.cwrapper_tmp # temp dir for building
cname=`basename "$0"` # derive base name of script
mkdir -p "$ctmp" # make sure it exists
tail -n +$cstart < "$0" > "$ctmp/$cname.c" # copy C source to temp file
# compile the source with gcc and if no errors run it with parms
gcc -o "$ctmp/$cname.bin" "$ctmp/$cname.c" && "$ctmp/$cname.bin" "$@"
if [ -e "$ctmp/$cname.c" ];then rm "$ctmp/$cname.c";fi # remove temps
if [ -e "$ctmp/$cname.bin" ];then rm "$ctmp/$cname.bin";fi
exit
# ===== end wrapper code, C code follows ======
#include<stdio.h>
main()
{
 printf("Hello World\n");
}
A problem with that approach is it messes up the C syntax
highlighting when editing, but it's easy enough to reselect C
mode.
There's an official way to do something like this called binfmtc
(can be installed from the repository). The main differences
(besides being faster and more sophisticated) are that instead of
a #! start line, the first line has to be /*BINFMTC: with */ on
the next line, and it's pickier about the source code - my crude
version doesn't complain about the lack of a main type and return
statement, and it only shows actual errors (no -W option in the
gcc line).
11/24/14 [edits 12/1/14, 12/22/14]
Of course C as a scripting language
kinda sorta sucks (almost everything seems hard to do), but this
same general technique can be used to turn just about any compiled
language into a scripted language. I've been playing around with a
BASIC-to-C
converter/compiler called BaCon and it's really cool. 64-bit
deb and rpm packages are available, or build from source. BaCon
and its GUI are written in the BaCon language (self-hosted) with a
clever bootstrap mechanism - the compiler is also
parallel-implemented as a huge multi-platform shell script which
is used to compile the compiler. Should work on most Unix-like
operating systems that have bash, make and gcc installed (or
compatible alternatives) plus a few standard utilities.
The language itself is mostly stock BASIC with a few differences
including...
- Doesn't have INKEY$ but has WAIT, which works great for
  non-blocking non-echoing key detection. Also has a blocking
  GETKEY.
- DECLARE is used to declare arrays; subscripts are bracketed and
  C-style.. DECLARE a[10][10] instead of DIM a(10,10).
- There is NO bounds-checking on arrays (just like C), be careful.
  Use OPTION BASE 1 for normal BASIC behavior.
- RND returns a large integer between 0 and the constant MAXRANDOM,
  rather than a float between 0 and 1.
- MOD is a function rather than an operator.
- Some variable names (like y1 or anything else reserved in C and
  its libraries) can't be used. It'll complain about redefining.
...but it's all well-documented and not nearly as different (at
least for the kind of code I typically write) as FreeBasic's
native language, it's more like QBasic with a few different words.
Many of BaCon's differences are because it converts the code to C
so expressions are often C-like to keep it simple. BaCon provides
the power and speed of C without having to suffer the complication
of writing C code.
One thing I really like about BaCon is the compiler source is a
single BaCon file (bacon.bac) that can be easily edited and
recompiled as needed. When I first started messing around with
version 3.0.1, I ran into an issue with reporting error messages
(fixed in the 3.0.2 version).. quickly hacked a temporary fix
despite being unfamiliar with the code. Ran into a minor issue
running under CygWin (a gcc warning about not needing the -fPIC
flag).. another trivial fix (or ignore, the resulting bacon.exe
and library worked fine and normal usage was not affected). Being
able to fix or change stuff myself is very nice!
Anyway... to turn it into "baconscript" I made this bash script... [updated 11/26/14 to handle INCLUDE]
#!/bin/bash
# baconscript - 11/26/14 WTN
# permits running BaCon programs like scripts using a #! line
# note.. script base name must not contain spaces
if [ -e "$1" ];then
 bname=`basename "$1"` # derive base name of script
 btmp="$HOME/.baconscript_tmp/$bname" # temp dir for building
 if [ -e "$btmp" ];then rm -r "$btmp";fi # remove old temps
 mkdir -p "$btmp" # create empty temp directory
 tail -n +2 < "$1" > "$btmp/$bname.bac" # copy bacon code to temp dir
 # also copy any included files..
 grep "INCLUDE " "$1"|cut -d'"' -f2|while read f;do
  if [ -e "$f" ];then cp "$f" "$btmp/";fi;done
 shift # move all parms down 1
 bacon -d "$btmp" "$btmp/$bname.bac">/dev/null # compile bacon code
 if [ -e "$btmp/$bname" ];then # if compile successful
  "$btmp/$bname" "$@" # execute the program with command line parms
  rm -r "$btmp" # remove the temp directory and temp files
 else # compile not successful...
  if [ -e "$btmp/$bname.bac.log" ];then # if error log exists
   echo;echo "===== C compiler error log ====="
   cat "$btmp/$bname.bac.log" # show the log
  fi
 fi # if not successful leave temps in place for debugging
fi
...and copied it to a file named "/usr/local/bin/baconscript",
now BaCon programs can be turned into scripts by adding
#!/usr/local/bin/baconscript to the first line and make it
executable. One big advantage of scripted code is that on most
Linux distros scripts can be double-clicked and the system asks
whether to run them in a terminal. Usually if a binary is
double-clicked it does not launch a terminal, which of course
won't work for a console program.. to run the compiled binary you
have to drop to a terminal first. Turning programs into scripts
saves time.
Here's a sample program showing some BaCon features I find
useful...
#!/usr/local/bin/baconscript
a$ = ARGUMENT$
z = INSTR(a$," ")
IF z>0 THEN
   PRINT "Program arguments: ";MID$(a$,z+1)
END IF
PRINT "Enter something: ";
INPUT a$
IF a$ <> "" THEN
   PRINT "You entered: ";a$
END IF
PRINT "Press a key: ";
toggle = 0
REPEAT
   keycode = WAIT(0,200)
   IF keycode = 0 THEN
      IF toggle = 0 THEN
         PRINT "*";CHR$(8);
         toggle = 1
      ELSE
         PRINT " ";CHR$(8);
         toggle = 0
      END IF
   END IF
UNTIL keycode > 0
PRINT " "
PRINT "You pressed: ";CHR$(keycode)
PRINT "Press any key to exit..."
keycode = GETKEY
END
Although some parts of the language are BaCon-specific and only
work for unix/linux, simpler programs can be written so they'll
work under just about any BASIC, or at least are easy to convert -
most of the time it's the algorithm itself I want to preserve, and
BASIC permits algorithms to be expressed cleanly without a lot of
programming-language baggage. In my opinion - maybe just because
I'm used to it.. but what I'm used to is very important when I
need a one-off program quickly - in such cases I usually need an
answer fast and don't want to spend extra time figuring out
syntax.
Here's a version of the yesno program for BaCon(script)...
#!/usr/local/bin/baconscript
' yesno.bs - WTN 11/25/14, 12/22/14
OPTION BASE 1
DECLARE a$[6]
a$[1] = "No."
a$[2] = "Probably not."
a$[3] = "Maybe."
a$[4] = "Probably."
a$[5] = "Yes."
a$[6] = "Don't know."
answers = 6 :'number of answers
PRINT
PRINT "This program answers yes/no questions."
PRINT "Press just enter to exit."
PRINT
counter = RND
exitprogram = 0
WHILE exitprogram = 0
 empty = 1
 PRINT "Enter your question..."
 entered = 0
 WHILE entered = 0
  REPEAT
   keycode = WAIT(0,20)
   counter = MOD(counter,answers)
   counter = counter + 1
  UNTIL keycode > 0
  IF keycode = 10 THEN
   entered = 1
   IF empty = 1 THEN
    exitprogram = 1
   ELSE
    PRINT
    PRINT a$[counter]
    counter = RND
   END IF
  ELSE
   IF keycode = 127 OR keycode = 8 THEN
    PRINT CHR$(8);" ";CHR$(8);
   ELSE
    empty = 0
    PRINT CHR$(keycode);
   END IF
  END IF
 WEND
WEND
END
NOW returns a large integer representing the number of seconds
since 1970, used to SEED the random number generator. RND returns
a (large) integer between 0 and the reserved constant MAXRANDOM;
the counter variable is reduced to the correct range using the
MOD function, so for this app the size of the number does not
matter. Three flags control the program logic: exitprogram, empty
and entered. The empty flag detects when an empty line was
entered so the exitprogram flag can be set, and the entered flag
detects when an answer was printed so the prompt can be printed
again - it could probably be coded better but I was going for
simplicity. The counter cycles between 1 and the number of
answers, incrementing or resetting every 20 milliseconds while
waiting for a keypress. As far as I can tell, on all platforms
the enter key returns the single code 10, and the backspace key
returns 127 or 8 depending on the terminal settings.
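The same reduce-to-range trick works in plain shell - this is
just a sketch of the idea using bash's $RANDOM, not BaCon code:

```shell
# Reduce a big random integer to the range 1..answers with modulo,
# like the script does with MOD(counter,answers) followed by +1
answers=6
counter=$(( RANDOM % answers + 1 ))
echo "$counter"   # always between 1 and 6, regardless of RANDOM's size
```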
11/28/14 - OK I had to try it... BaCon under Windows.... it
works!!! Tested under Windows 7 (32 bit, running under Virtual
Box), requires a recent CygWin with make and gcc installed. Copied
bacon.bac and bacon.sh to a directory under C:\cygwin (under my
home dir for convenience), then in that directory under the cygwin
bash shell ran ./bacon.sh bacon.bac - it took a long time
(cygwin's bash is fairly slow) but eventually completed, with one
(non) error - a warning about -fPIC not being needed for the
target platform.. however it produced bacon.exe and libbacon.a,
which I copied to cygwin's /bin/ and /lib/ directories. That
option is only used when building a library (recompiling bacon
itself) and has no effect on normal compiling.. and it's trivial
to fix if one cares, just add IF INSTR(OS$,"CYGWIN") < 1 THEN
before the code that adds the -fPIC option (in 3 places)... I
love it [how easy it
is to adapt the code...but some versions of cygwin/gcc may still
need that flag]. After "installing" into the cygwin environment
can do bacon program.bac to produce an exe - although cygwin is
needed to compile, the resulting console program runs fine under
regular Windows, all it needs is the cygwin dll runtime (as with
any other cygwin gcc-produced program). Cool... now I have more
options.
12/8/14 - MATE 14.04
running in VirtualBox after a bit of playing around...
That's a modification of the Dust theme (from
gnome-themes-ubuntu), with Ambiance borders and Ubuntu-Mono-Dark
icons. Dust doesn't provide a GTK3 theme, but all I had to do to
fix that was to drag the gtk3 folder from Ambiance (or whatever)
to the Dust folder, then apps like synaptic won't run with ugly
defaults. At first I tried using a modification of the default
Ambient-MATE theme but that causes some dialogs [including the
menu editor and the dialog for creating desktop shortcuts] to show
white text on a white background [at least for this virtual
install].. so installed a bunch of themes and theme engines from
the repository and found something that works. MATE itself is
GTK2, and pretty much a clone of the way Gnome was before Gnome 3
came along, just the names of the components have changed to avoid
conflicts. Some paths are different based on the new names, for
example file manager scripts go in ~/.config/caja/scripts. No docs
got installed and the mate-user-guide package isn't available from
the official repositories, so found a .deb for the mint version on
the UbuntuUpdates
site. Also had to do that with Assogiate and its libgnome-vfsmm
dependency - being able to create/edit my own file (mime) types is
an absolute must for me. Installing "out of band" packages isn't
usually recommended, can potentially break things, but in these
cases it worked fine. As usual for a new OS had to install all
sorts of stuff.. make, dkms and gcc so the VB guest additions would
install, gdebi and synaptic for installing and maintaining
packages, utilities like htop and xterm, themes and theme tools,
etc.. the virtual install is a test run to make sure the important
stuff works before investing effort into setting everything up on
real hardware.
I like it! It's like Gnome 2 was before it got "improved" - it has an Appearance applet that can save new themes based on controls, icons and window decorations from existing themes, built-in right-click options for creating desktop icons, launching a terminal and browsing files as root, an associations dialog that has the option to run with a custom command, a panel that lets me put stuff where I want.. those little things I used all the time until they got removed. I hope this becomes an official Ubuntu flavor, but official or not it's still tied to the current 14.04 infrastructure, should be good until 2019.
12/22/14 - Apart from the GUI, some system-related procedures are a bit different with 14.04 but that's to be expected. I still use many 32-bit apps in binary form (pmars/pmarsv, old stuff, anything made using FreeBasic), to enable 32-bit support I issued the equivalent of the following commands...
sudo dpkg --add-architecture i386
sudo apt-get update
sudo apt-get install libc6:i386 libstdc++6:i386 libncurses5:i386 libx11-6:i386
...that should be enough to get most simple 32-bit binaries
running. There used to be the ia32-libs[-multiarch] package that
installed common 32-bit libraries, but it's no longer available
(and most aren't needed anyway). Rather just install the libraries
that are actually needed. Use the ldd command to find out what
libraries are linked by a binary, if something isn't found it's
marked as missing. Remember to run the ldconfig command after
manually copying libraries into /usr/lib so that binaries can find
them.
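That check can be sketched like this (/bin/sh stands in for a
real 32-bit binary, which is an assumption for illustration):

```shell
# List the shared libraries a binary links; anything the loader can't
# resolve shows up as "not found"
ldd /bin/sh | grep "not found" || echo "all libraries resolved"
# after manually copying a library into /usr/lib, refresh the cache with:
#   sudo ldconfig
```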
MATE 14.04 looks like it can work for me.. so far no problems
running blassic, bacon, dosemu, freebasic, scripts, etc, and for
creating custom file types Assogiate works even though I have to
install an older version (the Rox MIME editor also still works).
But it's not something I'll be able to simply switch to... there
are many more things that need testing, and my current 12.04
system has had years of customization, much of which would have to
be duplicated. Even if a new OS is perfect (although none are),
it's still a pain to uproot and switch. Nevertheless it's
comforting to know there's an alternative to all the "new stuff"
going around.
7/26/16
Haven't made an entry in a while... not much has changed with my
system. Still running Ubuntu 12.04, for now it works fine. Right
now the only consistent bug is that recent versions of Firefox don't
always update the screen after keystrokes - Chrome doesn't do that
so that's what I run if I have to type a lot. Ha! after googling I
fixed that - in about:config set
layers.offmainthreadcomposition.enabled to false, apparently it's
a new feature that was added at v33 that isn't compatible with
some video systems. Very occasionally when cold the system fails
to boot to the GUI and I get dumped to a console, doing a
control-alt-delete to reboot fixes that (or typing startx if in a
hurry). Sounds like a video card issue, it's been that way for a
long time. Very rarely my self-compiled "indicator-sensors"
cpu/video temperature widget fails to initialize so have an icon
that runs "gnome-panel --replace" to click when that happens. Hard
to find much to complain about - that's why I'm dreading the
upgrade. I had Ubuntu 14.04 installed in another partition for
awhile, thinking I'd move to it, but when I boot into it it's
depressing - my world is gone and getting it all back is not
trivial, it took years to get this stuff set up the way I want. I
wish Ubuntu would go to a rolling release model and do it right
(from a user's point of view) - if something is incompatible or no
longer available don't just remove it, just don't install whatever
it conflicts with and let me decide if I want to fix it. I don't
mind (some) change or shiny new things (if they work), but I very
much mind changes that make the things I depend on no longer work.
Anyway... here are a few tidbits to share...
Fan speed control has always been a semi-issue on my system,
sometimes heavy-load tasks "forget" to speed up the fan
(motherboard issues). Seems better now since the k10temp driver
became integrated into the kernel but still sometimes whatever
speed it thinks it needs I want more. So made a simple circuit
that connects in-line with the CPU fan...
pwm O---blue---*--22K--. .-----O
tach O---yel----|-------|--------------------|-----O to
12V O---red----|---*---*--------------------|-----O fan
gnd O---blk--*-|---|------------------------|-----O
| | | speed control |
| | 2.2K max |
|4.7K | .-----> 1M(AT) |
| | | | |min |c
| | *--*---|<|--*--22K--*--|< 2N3904
| | |c 1N4148 | | |e
| `-|< 2N3904 1000p 100K |
| |e | | |
`-----*-----------*-------*----'
All it does is stretch the PWM signal - when the control is at
minimum the fan runs at normal speed, as the control is advanced
it adds extra time to each high pulse, at maximum the fan runs
full speed regardless of the PWM input signal.
The recently used list in gnome panel and other apps is handy,
but lots of times I'll move stuff around or delete stuff and it
isn't smart enough to know about that, resulting in invalid
entries. So wrote a blassic script... [updated 8/24/16]
------- begin remove_recent.blassic ---------------------------------------
#!/usr/local/bin/blassic
rem remove_recent.blassic - 7/12/2016 WTN
rem 8/24/16 - remove duplicate entries too
rem This blassic/bash program scans the recently-used.xbel file and
rem deletes entries that no longer exist. Understands %20 for space
rem but other % codes not supported, these will be removed.
rem note - redundant removal feature requires the realpath utility.
rem this script must be marked executable, make sure the
rem initial #! line points to the blassic binary.
on error goto errorhandler
dim fname$(10000)
nextfname=1
deletedcount=0
temp$="/tmp/_remove_tempfile_"
helper$="/tmp/_remove_temp_helper_"
result$="/tmp/_remove_temp_result_"
shell "2>/dev/null ls /usr/bin/realpath > "+temp$
shell "echo >> "+temp$
open temp$ for input as #1
input #1,a$
close #1
if a$<>"/usr/bin/realpath" then goto norealpathutility
print "Removing non-existent and duplicate recent file entries..."
goto startprocessing
label norealpathutility:
print "Removing non-existent recent file entries..."
print "(install realpath for duplicate entry feature)"
label startprocessing:
open helper$ for output as #2
print #2,"#!/bin/bash"
print #2,"fn=`echo ";chr$(34);"$1";chr$(34);"|sed -e 's/%20/ /g'`"
print #2,"if [ -e ";chr$(34);"$fn";chr$(34);" ];then echo yes"
print #2,"if [ -e /usr/bin/realpath ]; then realpath ";chr$(34);"$fn";chr$(34)
print #2,"else echo ";chr$(34);"$fn";chr$(34);";fi"
print #2,"else echo no;echo;fi"
close #2
shell "chmod +x "+helper$
shell "echo $HOME > "+temp$
open temp$ for input as #1
input #1,home$
close #1
kill temp$
open home$+"/.local/share/recently-used.xbel" for input as #1
open temp$ for output as #2
while not eof(1)
line input #1,a$
b$=ltrim$(a$)
if left$(b$,15)<>"<bookmark href=" then goto writeline
c$=mid$(b$,24)
q=instr(c$,chr$(34))
c$=left$(c$,q-1)
shell helper$+" "+chr$(34)+c$+chr$(34)+">"+result$+" 2>/dev/null"
open result$ for input as #3
input #3,d$
line input #3,e$
close #3
if d$<>"yes" then goto skipdupcheck
for f = 1 to nextfname
if fname$(f)=e$ then goto skiplines
next f
fname$(nextfname)=e$
nextfname=nextfname+1
label skipdupcheck:
if d$<>"no" then goto writeline
label skiplines:
line input #1,a$
b$=ltrim$(a$)
if left$(b$,11)<>"</bookmark>" then goto skiplines
deletedcount=deletedcount+1
goto continuelooping
label writeline:
print #2,a$
label continuelooping:
wend
close #1
close #2
shell "rm $HOME/.local/share/recently-used.xbel*":rem extra copies too
shell "cp "+temp$+" $HOME/.local/share/recently-used.xbel"
kill temp$
kill result$
kill helper$
if deletedcount=0 then print "No entries removed."
if deletedcount>0 then print "Removed ";deletedcount;
if deletedcount=1 then print " entry."
if deletedcount>1 then print " entries."
goto exitscript
label errorhandler:
print "An error occured: code ";err;" at line ";erl/10+1
label exitscript:
shell "sleep 3"
system
------- end remove_recent.blassic -----------------------------------------
Some fine spaghetti-code there... but I don't mind [it's very
similar to the kinds of BASICs I grew up on, at least it doesn't
require line numbers]. A handy command in the bash part is a sed
line that does search and replace - sed -e 's/old/new/g' - echo a
variable into it then capture the output into another variable, in
this case fn=`echo "$1" | sed -e 's/%20/ /g'` to change the %20's
in the XML file to spaces so that filenames with spaces in them
will be properly processed. [updated 8/24/16] The script now also
removes redundant duplicate entries that can happen when files are
opened from symlinked directories, to do this it uses the realpath
utility which returns the true filename of a file. This usually
isn't installed by default so it checks first. Also added
variables for the temp files and better error handling, if an
error occurs the temp files are not removed to help track down the
problem.
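That sed line can be tried standalone - here's the same
substitution on an example filename:

```shell
# Decode %20 to spaces the same way the helper script does
fn=$(echo "Writers%20Night%20set1.mp3" | sed -e 's/%20/ /g')
echo "$fn"   # → Writers Night set1.mp3
```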
For the last few months I've been recording the writers nights at
the club where I run sound - recorded on my Tascam DR-40 then
processed using Ardour/JAMin/TimeMachine/Audacity to create an MP3
file for each round, followed with EasyTag to add the title and
artist metadata. Usually there are two artists on each half-hour
set. I also rename the files to include the artist names but after
over 200 set files with over 100 artists finding things manually
was getting very tedious. So wrote these bash and blassic
scripts...
------- file makemp3index.sh --------------------
#!/bin/bash
# requires soxi and blassic
# makemp3indexhelper.blassic must be in current directory
# original 5/9/16 mod 6/13/16 to add length/size to filename
outfile="masterlist_by_file.txt"
echo "Generating $outfile"
if [ -e "$outfile" ];then
 rm "$outfile"
fi
find | grep ".mp3$" | sort | sed "s/^.\///" | while read f
do
 artist=`soxi "$f" | grep "^Artist="`
 filesize=`soxi "$f" | grep "^File Size :" | tail -c +18`
 duration=`soxi "$f" | grep "^Duration :" | tail -c +18 | head -c 8`
 if echo $duration | grep -q "^00:"; then
  duration=`echo $duration | tail -c +4`
 fi
 if [ "$artist" != "" ];then
  echo "File=$f ($duration,$filesize)" >> "$outfile"
  echo "$artist" >> "$outfile"
 fi
done
blassic makemp3indexhelper.blassic
echo "Done"
sleep 5
------- end makemp3index.sh ---------------------
------- file makemp3indexhelper.blassic ---------
rem File makemp3indexhelper.blassic, called by makemp3index.sh
rem Made by Terry Newton (WTN)
rem Original version written May 9 2016 to process a collection of MP3
rem recordings so I can tell at a glance who played when and on which file
rem Slightly modified May 30 2016 to add a line before the artist names
rem Modified June 12 2016 to separate (duration,filesize) if present
rem
rem This is blassic code (a type of BASIC), requires blassic and sort.
rem I like blassic because it's extremely fast and avoids compiling,
rem it does require using a lot of goto statements due to its lack of
rem block if/then/else but for quick and dirty stuff like this it's great.
rem If blassic isn't available can be easily converted to another BASIC
rem like FreeBasic, just remove "label " from the labels. I run this
rem under Linux but should work under Windows with some mods, mainly
rem to rewrite the makemp3index shell script in batch or use Cygwin etc.
rem Blassic and sox are available for Windows, sort should work the same.
rem To get soxi for parsing tags copy "sox.exe" and name it "soxi.exe".
rem
rem This program takes an input file in the form of...
rem File=filename.mp3
rem Artist=artist name[ and another artist name[ and ...]...]
rem ...repeated for every MP3 file then outputs a sorted list
rem of artists along with the files they are featured on.
rem Example input file...
rem -----------------------------------------------------------------
rem File=Writers_Night_151203_set1_Jamie_Wayz_Jim_Martin.mp3 (22:33,54.1M)
rem Artist=Jamie Wayz and Jim Martin
rem File=Writers_Night_151203_set2_Ariel_Petrie_David_Dale_King.mp3 (25:42,61.7M)
rem Artist=Ariel Petrie and David Dale King
rem -----------------------------------------------------------------
rem (duration,filesize) is optional
rem This produces the following in the output file...
rem -----------------------------------------------------------------
rem Ariel Petrie
rem  Writers_Night_151203_set2_Ariel_Petrie_David_Dale_King.mp3 (25:42,61.7M)
rem
rem David Dale King
rem  Writers_Night_151203_set2_Ariel_Petrie_David_Dale_King.mp3 (25:42,61.7M)
rem
rem Jamie Wayz
rem  Writers_Night_151203_set1_Jamie_Wayz_Jim_Martin.mp3 (22:33,54.1M)
rem
rem Jim Martin
rem  Writers_Night_151203_set1_Jamie_Wayz_Jim_Martin.mp3 (22:33,54.1M)
rem -----------------------------------------------------------------
rem Also produces a simple html file with the same content, but with
rem links to the files so they can be played in a browser.
rem
rem The input file is generated by the makemp3index script, essentially...
rem echo "File=$f">>$out;soxi "$f"|grep "Artist=">>$out (requires soxi)
rem ...called in a loop for each MP3 file ($f=mp3 file, $out=output file)
rem The input file must not contain any other data or extra line ends.
rem
rem To avoid separating names that happen to contain " and " this program
rem also checks to see if an and exception list exists, put any names that
rem shouldn't be separated in this file, must match exactly.
rem
inputfile$ = "masterlist_by_file.txt"
outputfile$ = "masterlist_by_artist.txt"
htmloutputfile$ = "masterlist.html"
exceptionfile$ = "andexceptions.txt"
arraylimit = 1000 : rem increase this if more than 1000 files/artists
dim artist$(arraylimit,arraylimit)
rem artist$(n,1) = artist name  artist$(n,2) and above = file names
dim andexception$(arraylimit)
print "Processing ";inputfile$
numberofartists = 0
numberoffiles = 0
numberofexceptions = 0
on error goto noexceptionlist
open exceptionfile$ for input as #1
print "Reading and exceptions from ";exceptionfile$
while not eof(1)
line input #1, t$
t$ = ltrim$(rtrim$(t$))
if t$ = "" then goto skipexception
numberofexceptions = numberofexceptions + 1
andexception$(numberofexceptions) = t$
label skipexception:
wend
print "Number of exceptions = ";numberofexceptions
label noexceptionlist:
close #1
on error goto inputfilemissing
open inputfile$ for input as #1
on error goto programerror
while not eof(1)
line input #1, a$
line input #1, b$
if left$(a$,5) <> "File=" then goto fileerror
if left$(b$,7) <> "Artist=" then goto fileerror
numberoffiles = numberoffiles + 1
a$=mid$(a$,6)
b$=mid$(b$,8)
b$ = ltrim$(rtrim$(b$)):rem in case extra spaces in tag
q=instr(b$," and ")
if numberofexceptions = 0 or q = 0 then goto noexceptions1
for k = 1 to numberofexceptions
elen = len(andexception$(k))
if left$(b$,elen) = andexception$(k) then q = instr(elen,b$," and ")
next k
label noexceptions1:
if q > 0 then goto processmultiple
label addartist:
rem see if unique
if numberofartists = 0 then goto newartist
n = 0
for i = 1 to numberofartists
if artist$(i,1) = b$ then n = i: i = numberofartists
next i
if n = 0 then goto newartist
for i = 2 to arraylimit
if artist$(n,i) <> "" and i = arraylimit then goto limiterror
if artist$(n,i) = "" then artist$(n,i) = a$: i = arraylimit
next i
goto keepprocessing
label newartist:
numberofartists = numberofartists + 1
if numberofartists > arraylimit then goto limiterror
artist$(numberofartists,1) = b$
artist$(numberofartists,2) = a$
goto keepprocessing
label processmultiple:
while q<>0
t$ = left$(b$, q-1)
b$ = mid$(b$,q+5)
n = 0
for i = 1 to numberofartists
if artist$(i,1) = t$ then n = i: i = numberofartists
next i
if n = 0 then goto multiplenewartist
for i = 2 to arraylimit
if artist$(n,i) = "" then artist$(n,i) = a$: i = arraylimit
next i
goto keepprocessingmultiple
label multiplenewartist:
numberofartists = numberofartists + 1
if numberofartists > arraylimit then goto limiterror
artist$(numberofartists,1) = t$
artist$(numberofartists,2) = a$
label keepprocessingmultiple:
q = instr(b$," and ")
if numberofexceptions = 0 or q = 0 then goto noexceptions2
for k = 1 to numberofexceptions
elen = len(andexception$(k))
if left$(b$,elen) = andexception$(k) then q = instr(elen,b$," and ")
next k
label noexceptions2:
wend
goto addartist:rem loop back to add last item
label keepprocessing:
wend
close #1
if numberofartists = 0 then goto fileerror
print "Number of files = ";numberoffiles
print "Number of artists = ";numberofartists
print "Generating ";outputfile$;" and ";htmloutputfile$
open "__temp1" for output as #1
for i = 1 to numberofartists
print #1,artist$(i,1)
next i
close #1
shell "sort __temp1 > __temp2"
open "__temp2" for input as #1
open outputfile$ for output as #2
open htmloutputfile$ for output as #3
print #2,"================================"
print #2,"Total of ";numberofartists;" unique artist names"
print #2,"Total of ";numberoffiles;" MP3 files"
print #2,"================================"
print #3,"<html><h3>Index of MP3 files by artist name</h3>"
print #3,"<p>Total of ";numberofartists;" unique names in ";
print #3,numberoffiles;" files.</p>"
while not eof(1)
line input #1, t$
n = 0
for i = 1 to numberofartists
if artist$(i,1) = t$ then n = i: i = numberofartists
next i
if n = 0 then goto sorterror
print #2,""
print #2, t$
print #3, "<p>";t$
for i = 2 to arraylimit
if artist$(n,i) = "" then i = arraylimit : goto finisharrayprintloop
print #2,"  ";artist$(n,i)
print #3,"<br> <a href=";chr$(34);
c$ = artist$(n,i) : d$ = ""
if right$(c$,1) <> ")" then goto finishtag
q = instr(c$,".mp3 (") : if q=0 then goto finishtag
d$ = mid$(c$,q+4) : c$ = left$(c$,q+3)
label finishtag:
print #3,c$;chr$(34);">";c$;"</a>";d$
label finisharrayprintloop:
next i
print #3,"</p>"
wend
print #2,""
print #3,"</html>"
kill "__temp1"
kill "__temp2"
goto exitprogram
label fileerror:
print "input file not formatted correctly"
goto exitprogram
label inputfilemissing:
print "no input file"
goto exitprogram
label programerror:
print "an error occured",err,erl
goto exitprogram
label limiterror:
print "array limit exceeded, edit program to increase"
goto exitprogram
label sorterror:
print "system error with sort"
print "see __temp1 and __temp2"
label exitprogram:
close #1
close #2
close #3
system
------- end makemp3indexhelper.blassic ----------
...now I can tell quickly who played when and on what set. More
spaghetti magic.. a real programmer (for some definitions of real)
would freak out on goto-heavy code like that but having grown up
with 8-bit BASIC on the C64, COCO, Atari etc it doesn't bother me -
sure I'd rather have true block if then else but blassic doesn't
have that and it's not a big enough deal to resort to compilers or
try to figure out how to do all that string processing in another
language. One day I'll learn python well enough to pull off stuff
like this (when it stops changing).. but I don't care about
programming, I care about the answers.
11/20/16
The KiCad PCB package sure has advanced since the last time I
messed with it, especially "nice" things like the 3D viewer. A
test I made in 2011...
A PCB I recently completed using version 4.0.4...
Big difference! I didn't have to do much to get that - after
placing the parts most of the 3D models were already in place,
just had to add block models for the capacitors (picked from a
list). Initially I drafted the PCB using ExpressPCB, which is very
easy to use but it was going to cost >$300 to actually make the
boards, too expensive. Need Gerbers! Thought about using Altium
Designer but my install is set up mostly for surface mount stuff,
really didn't feel like going through the process of creating new
components and all the other stuff Altium makes me do before I can
even start. Recently there's been a lot of talk about KiCad so
gave it a try. To avoid dependency stuff I got the Windows version
and installed it in Win7 running under VirtualBox, time required
to learn how to use the software and create an initial version of
the board was about a day - much easier to learn than Altium. All
the parts I needed for this fairly simplistic board were in the
default libraries, and it's easy to change pad and hole sizes on
the PCB itself without having to make new footprints. When
importing netlist changes there's an option to preserve
footprints, no need to have to keep the schematic footprints in
sync. One thing took a bit to figure out - to create the
connection pads on the schematic I used 1 pin connectors then
fixed the footprint on the PCB. One oddity - in KiCad the Y axis
numbers increase from top to bottom, backwards from what I'm used
to. No big deal, just resulted in making the Gerber/drill Y
numbers negative, board houses don't care where the origin is so
long as everything lines up. It won't replace Altium for my work
stuff - for one thing it doesn't export .step files which I need
so the mechanical folks can make sure the board fits the case
(they use SolidWorks) - but for this project (a simple guitar
pedal going in a stock case) I didn't need that, KiCad got the job
done. Glad I gave it a shot because Seeed only wanted $40 to make
30 boards and half of that was shipping.
It would be nice to run KiCad natively but my old Ubuntu 12.04
won't support it - tried compiling but some of the required
libraries are too far out of date... that's one area where Windows
beats Linux. It doesn't have to be that way, would probably run
just fine if a binary package was offered that included all
dependencies, not exactly the "Linux Way" but I would much prefer
a more bloated install than having to run a Windows version in a
virtual machine.. works for Firefox and Google Earth and other
up-to-date programs and it's not like hard drive space and update
bandwidth are precious resources these days. Would be nice if
apps could be more decoupled from the GUI... but it's..
complicated.
I like 12.04, newer versions not as much mainly because they're
often missing features I use or make me click more - File Edit etc
menus exist for a reason but often get removed to "simplify" the
interface. But it won't be long before I have to change - my 12.04
install is developing other problems. After a Nvidia driver update
the mutter WM stopped working. Along with Gnome Shell (not that I
ever use it). Thanks to Gnome 3's configurability I still have
Compiz and Metacity so still in business but it's just a matter of
time before 12.04 becomes too much to keep going. MATE is a
possibility but I hesitate - not sure how to define multiple
configurations in MATE, something like the window manager dying
would be a total GUI failure instead of a minor issue. Also MATE
renamed a lot of stuff (out of necessity) so if I tried to install
it beside Gnome I'd have dups of all the core apps. Leaning
towards staying with gnome-panel and doing whatever needs doing to
get it running on a modern core, probably 16.04. Tempted to just
image the disk (and work from the copy with the old disk put up in
case it goes bad), distro upgrade to 14.04 then 16.04, then fix
the broken bits.. might be easier than starting over.
11/14/17
I'm still here :-) just been busy. Been a year and I STILL
haven't upgraded.. still on 12.04. It's been lovely not having
updates break random stuff, the broken stuff (which I mostly don't
care about) is still broken but overall this system is more stable
than ever. Software-wise anyway, hardware-wise it's getting dated,
things falling apart. The backup battery or capacitor or whatever
keeps the CMOS alive is getting weak, after coming back from a
summer vacation the darn thing refused to boot. Among things I
tried I yanked the video card to revert to on-board graphics and
it came up in 640x480, totally wrecking my icon layout.. in the
end resetting the CMOS to defaults did the trick. Tried to restore the
icon layout but couldn't figure out which backup file(s) to use,
and the ones I thought were "it" freaked things out more.. Gnome
(as is almost every modern system) is complicated, layers upon
layers. No matter, just got used to alphabetical icon arrangement.
Overall I still like Gnome's components, just not their default
UI. When the inevitable comes (probably in the form of a new
computer) I just hope I can get Gnome Panel and get
association/mime stuff like Assogiate running - being able to
control what happens when I double-click and define my own
right-click association menus is vital to my work flow.
Broke down and got a cheap Android "smart" phone.. $50 hehe got
it mainly for Lyft but it is kind of cool to surf the web and
stuff when I'm at work or away. It's a ZTE Z837VL running kernel
version 3.18.24 and Android version 6.0.1, only has 8 gigs of
flash so have to "beat stuff back" every now and then and move
accumulated cruft to a flash card I installed. Found a package
called mtp-tools with the commands mtp-files and mtp-getfile to
grab files off of it but the file system is really clunky,
somewhere in there is a normal Linux tree but it does its damndest
to keep me from it. Seems to have 2 partitions, a read-only system
partition and a read/write user partition. Although the core OS
will probably never update, apps update all the time and when they
do they use room from the read/write space. To recover have to go
to settings/apps, disable, remove then it insists on restoring the
stock version but at least I get some space back - don't need or
have any interest in Facebook Messenger, Twitter and all that other
social media crap it keeps trying to shove in my face. Even the
Yahoo email app sucks - try to download an attachment and it
disappears into a black hole (probably taking up space somewhere I
can't see), have to use Chrome and click through the "why aren't
you using the app" stuff. PDF's were working but after beating
back a bunch of Google stuff I didn't want all I got was "can't
find the app to open this file" (was probably using a web service
before), so installed a PDF viewer (not Adobe), works again now.
But wow, this is the dumbed down user interface everyone uses and
influences the makers of desktop GUI's... yuk. Still it's cool to
surf the web on a $50 "nano" computer (plus another $49/month for
the privilege).
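For reference, here's roughly how I use those mtp-tools commands.
This is a sketch, not a recipe - the file ID (58) is made up, and
the "File ID:" parsing assumes the line format mtp-files prints
on my system; the guard keeps it harmless when no tools or device
are present:

```shell
# Pull files off the phone with mtp-tools (no-op if tools/device absent)
if command -v mtp-files >/dev/null 2>&1; then
  mtp-files > filelist.txt 2>/dev/null || true      # dump the device file list
  grep "File ID:" filelist.txt | awk '{print $3}' || true  # extract numeric IDs
  mtp-getfile 58 recovered.jpg 2>/dev/null || true  # fetch one file by its ID
fi
```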
One thing that is very obvious since getting this phone - how
crappy the web has become without an ad blocker! Dangerous too.. I
try to only surf to sites I (sort of) trust, although I suppose
that's one major advantage of Android keeping the file system
locked up - on a PC pretty much everything is exposed, why it's so
important today to block stuff like cross-domain scripts. Browsers
are getting better but I hear about "drive by" infections all the
time, merely visiting a web site with a malicious ad is enough.
They don't target Linux (much, yet) but there's no fundamental
reason Linux can't be targeted.. scripts can drop files to any
directories exposed by the browser. All it takes is a browser
vulnerability and my entire user space is exposed (no need for
root - everything important to me is in user space, whereas OS
space is easily replaced and doesn't contain anything all that
important). The latest trend is some websites now block users who
use ad blockers, pardon my language but **** them, I simply won't
visit such sites and have never whitelisted as they suggest - not
until they can demonstrate that they actually vet the ads they run
and are willing to be responsible for any malicious ads that sneak
through. Ad delivery networks are a major security hole, they
can't vet the ads and just grab crap from who knows where and
shove it at my computer. No, just no. It's not the ads - if they
did it right (and a few web sites are starting to get it) then the
ad content would be indistinguishable from web content and
delivered from their own servers where they could vet the content
and could be held liable for distributing malware, so serving the
ads from their own site would mostly solve that issue. But alas
these days everyone wants free money and doesn't want to do any
extra work for it or be responsible for side effects. Another part
of the problem... when income depends on clicks it gives rise to
clickbait - those sites that block ad blockers rarely have
anything worth reading, and worse they plagiarize/regurgitate content
without even linking to the real source. I get the need to make money
but when your business model is basically tricking people into
clicking on useless stuff that might even cause damage, might want
to rethink that. The ads aren't the issue, it's the way they're
being delivered. Anyway back to work...
11/21/17
Here's a blassic script I made
to get jpg and pdf files off my phone without having to email them
to myself...
----- begin getfiles.blassic ---------------
#!/usr/local/bin/blassic
rem copies select files from phone to current directory
rem skips files that already exist
rem uses mtp-files and mtp-getfile from the mtp-tools package
rem specify extensions to copy in the string below...
extensions$ = ".jpg .pdf .mp3 .mp4"
dim fileids(20000)
dim filenames$(20000)
print "Scanning phone for files..."
shell "mtp-files > filelist.tmp 2>/dev/null"
open "filelist.tmp" for input as #1
index = 0
while not eof(1)
line input #1, a$
if left$(a$,9) <> "File ID: " then goto nextline
index = index + 1
fileids(index) = val(mid$(a$,10))
line input #1, a$
if left$(a$,13) <> "   Filename: " then goto mtperror
filenames$(index) = mid$(a$,14)
label nextline:
wend
close #1
if index = 0 then print "No files found": goto cleanup
print
print "Found ";index;" files..."
print
for i = 1 to index
print fileids(i),filenames$(i)
next i
print
print "Copying ";extensions$;" files..."
print
for i = 1 to index
a$ = inkey$: if a$ <> "" then system
a = instr(extensions$,lcase$(right$(filenames$(i),4)))
if a = 0 then goto nextfile
shell "ls "+chr$(34)+filenames$(i)+chr$(34)+">/dev/null 2>/dev/null"
if peek(sysvarptr+24) = 0 then goto nextfile: rem sysvar 24 is exit code
mtp$ = "mtp-getfile "+str$(fileids(i))+" "+chr$(34)+filenames$(i)+chr$(34)
print mtp$
shell mtp$+" >/dev/null 2>/dev/null"
label nextfile:
next i
label cleanup:
kill "filelist.tmp"
system
label mtperror:
close #1
print "something is wrong with mtp-files output, check filelist.tmp"
system
----- end getfiles.blassic -----------------
...hacky but works for me, lots better than manually typing
mtp-getfile commands. One hack that's often needed with blassic
scripts (and BASIC in general) is checking if a file exists,
previously I shelled a command that redirected a test result to a
file then opened and read the result. And I had to keep track of
whether the test was ever made, to avoid an error when the script
tries to kill a temp file that doesn't exist. Found a somewhat better way to do
it.. blassic sysvar 24 contains the exit code of the last shell
command, so a command that returns an error code for a
non-existing file argument can be used to test. I chose the ls
command. The technique is fairly simple...
shell "ls "+chr$(34)+file$+chr$(34)+">/dev/null 2>/dev/null"
if peek(sysvarptr+24) = 0 then goto file_or_directory_exists
...at least simpler than the way I was doing it, usually something like...
shell "{ if [ -e "+chr$(34)+file$+chr$(34)+" ];then echo y;else echo n;fi } > file.tmp"
open "file.tmp" for input as #3:input a$:close #3:kill "file.tmp"
if a$ = "y" then goto file_or_directory_exists
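For what it's worth, the exit-code behavior the sysvar peek relies on can be checked straight from a shell - a throwaway sketch (file names made up):

```shell
# ls exits 0 when its argument exists and nonzero when it doesn't;
# that exit code is what blassic exposes at sysvarptr+24
tmpfile=$(mktemp)                       # a file that definitely exists
ls "$tmpfile" >/dev/null 2>&1 && echo "exists"
ls /no/such/file.$$ >/dev/null 2>&1 || echo "missing"
rm -f "$tmpfile"
```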
Blassic (and for that matter BASIC) has its warts but it's still
my go-to language (ha) for quick hacks like this.. previously I was
worried it might go away but version 10.0.3 was last modified in
2016, fixing an issue with newer versions of gcc. Seems to work
ok... graphics works now! Previously it couldn't find the X
libraries and compiled to a text-only version.
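The ID/filename pairing the script parses out can also be sketched in awk; here a canned sample stands in for live mtp-files output (the "File ID: "/"   Filename: " format is inferred from the string tests in the script above):

```shell
# parse "File ID"/"Filename" pairs the way getfiles.blassic does;
# a printf-canned sample stands in for a real `mtp-files` run
printf 'File ID: 42\n   Filename: photo.jpg\nFile ID: 43\n   Filename: notes.pdf\n' |
awk '/^File ID: /  { id = substr($0, 10) }
     /Filename: /  { sub(/^ *Filename: /, ""); print id "\t" $0 }'
```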
12/2/17
A common thing to do in electronics is to measure or calculate a
resistor value, then match it to the closest stock value. Before I
was looking it up on a chart but that can get tedious, especially
when a lot of measurements have to be converted. So wrote this
blassic script...
------- begin closestR.blassic -------------------------------
#!/usr/local/bin/blassic
rem a program for finding standard resistor values
rem by WTN, last mod 20171206
print "=== Resistor Value Finder ==="
print "Finds the closest stock 1% and 5% resistor values."
print "Entry can include K or M suffix, output is standard"
print "resistor notation. Enter an empty value to exit."
dim valueE96(97),valueE24(25)
rem E96 values, extra decade value at the end to simplify code
data 100,102,105,107,110,113,115,118,121,124,127,130,133,137
data 140,143,147,150,154,158,162,165,169,174,178,182,187,191
data 196,200,205,210,215,221,226,232,237,243,249,255,261,267
data 274,280,287,294,301,309,316,324,332,340,348,357,365,374
data 383,392,402,412,422,432,442,453,464,475,487,499,511,523
data 536,549,562,576,590,604,619,634,649,665,681,698,715,732
data 750,768,787,806,825,845,866,887,909,931,953,976,1000
for i=1 to 97:read valueE96(i):next i
rem E24 values+decade
data 10,11,12,13,15,16,18,20,22,24,27,30
data 33,36,39,43,47,51,56,62,68,75,82,91,100
for i=1 to 25:read valueE24(i):next i
label entervalue:
line input "Desired value: ",desired$
desired$ = ltrim$(rtrim$(ucase$(desired$)))
if desired$ = "" then goto exitprogram
mult = 1:num$ = desired$
if right$(desired$,1) = "K" then mult = 1000:num$=left$(num$,len(num$)-1)
if right$(desired$,1) = "M" then mult = 1000000:num$=left$(num$,len(num$)-1)
rem blassic's val() ignores trailing invalid characters so validate manually
rem num$ must contain only 0-9 and no more than one decimal point
E = 0:E1 = 0:PC = 0
for i = 1 to len(num$)
a$=mid$(num$,i,1)
if asc(a$) < asc("0") or asc(a$) > asc("9") then E1 = 1
if a$ = "." then E1 = 0:PC = PC + 1
if E1 = 1 then E = 1
next i
if E = 0 and PC < 2 then goto entryok
print "Don't understand that, try again"
goto entervalue
label entryok:
rem calculate desired value from string and multiplier
desiredR = val(num$) * mult
if desiredR >= 0.1 and desiredR <= 100000000 then goto valueok
print "Value must be from 0.1 to 100M"
goto entervalue
label valueok:
rem determine multiplier to convert stored values
norm = 0.001
if desiredR >= 1 then norm = 0.01
if desiredR >= 10 then norm = 0.1
if desiredR >= 100 then norm = 1
if desiredR >= 1000 then norm = 10
if desiredR >= 10000 then norm = 100
if desiredR >= 100000 then norm = 1000
if desiredR >= 1000000 then norm = 10000
if desiredR >= 10000000 then norm = 100000
rem determine lower value match, upper match is one more
rem compare to a slightly smaller value to avoid FP errors
for i = 1 to 96
if desiredR > valueE96(i)*norm-0.00001 then v1 = i:v2 = i+1
next i
lowerE96 = valueE96(v1)*norm
upperE96 = valueE96(v2)*norm
rem do the same for E24 series, using norm*10 since E24 values are 2 digit
for i = 1 to 24
if desiredR > valueE24(i)*norm*10-0.00001 then v1 = i:v2 = i+1
next i
lowerE24 = valueE24(v1)*norm*10
upperE24 = valueE24(v2)*norm*10
rem calculate error percentages for lower and upper values
lowerE96error = (1-lowerE96/desiredR)*100
upperE96error = -(1-upperE96/desiredR)*100
lowerE24error = (1-lowerE24/desiredR)*100
upperE24error = -(1-upperE24/desiredR)*100
rem determine which value has less error
rem in the event of a tie go with the higher value (user can pick)
closestE96=lowerE96:if lowerE96error>=upperE96error then closestE96=upperE96
closestE24=lowerE24:if lowerE24error>=upperE24error then closestE24=upperE24
rem print the closest value and error percentages for lower/upper values
print "Closest E96 value = ";
R = closestE96:gosub convertE96:print R$;space$(7-len(R$));"(";
rem to detect exact matches compare to a range to avoid float errors
M=0:if closestE96 > desiredR-0.0001 and closestE96 < desiredR+0.0001 then M=1
if M=1 then print "exact match)": goto printE24values
E$=left$(str$(lowerE96error+0.0001),4)
R = lowerE96:gosub convertE96:print "-";E$;"%=";R$;",";
E$=left$(str$(upperE96error+0.0001),4)
R = upperE96:gosub convertE96:print "+";E$;"%=";R$;")"
label printE24values:
print "Closest E24 value = ";
R = closestE24:gosub convertE24:print R$;space$(7-len(R$));"(";
M=0:if closestE24 > desiredR-0.0001 and closestE24 < desiredR+0.0001 then M=1
if M=1 then print "exact match)": goto doneprintingvalues
E$=left$(str$(lowerE24error+0.0001),4)
R = lowerE24:gosub convertE24:print "-";E$;"%=";R$;",";
E$=left$(str$(upperE24error+0.0001),4)
R = upperE24:gosub convertE24:print "+";E$;"%=";R$;")"
label doneprintingvalues:
goto entervalue:rem loop back to enter another value
label exitprogram:
system
rem subroutines to convert R value back to standard notation
rem input R containing resistor value (with possible float errors)
rem output R$ containing value in standard resistor notation
label convertE96:
R$="error":R2$="":R1=R+0.00001:R2=R
if R1 >= 1000 then R2 = R1/1000:R2$ = "K"
if R1 >= 1000000 then R2 = R1/1000000:R2$ = "M"
if R2<1 then R$ = left$(str$(R2+0.00001)+"000",5)
if R2>=1 and R2<100 then R$ = left$(str$(R2+0.00001)+"000",4)+R2$
if R2>=100 and R2<1000 then R$ = left$(str$(R2),3)+R2$
return
label convertE24:
R$="error":R2$="":R1=R+0.00001:R2=R
if R1 >= 1000 then R2 = R1/1000:R2$ = "K"
if R1 >= 1000000 then R2 = R1/1000000:R2$ = "M"
if R2<1 then R$ = left$(str$(R2+0.00001)+"00",4)
if R2>=1 and R2<10 then R$ = left$(str$(R2+0.00001)+"00",3)+R2$
if R2>=10 and R2<100 then R$ = left$(str$(R2),2)+R2$
if R2>=100 and R2<1000 then R$ = left$(str$(R2),3)+R2$
return
------- end closestR.blassic ---------------------------------
[12/20/17 - minor update to use line input instead of just input]
Works like this...
=== Resistor Value Finder ===
Finds the closest stock 1% and 5% resistor values.
Entry can include K or M suffix, output is standard
resistor notation. Enter an empty value to exit.
Desired value: 45.5k
Closest E96 value = 45.3K (-0.43%=45.3K,+1.97%=46.4K)
Closest E24 value = 47K (-5.49%=43K,+3.29%=47K)
Desired value:
Seems like an easy thing to program - store the base table in an
array, figure out the scaling factor then find the closest values
- but as usual, those pesky details. I don't want to see 45300,
needs to say 45.3K, getting it to display correctly for all cases
took a bit of hacking. Another issue is floating point
representation error - a number like 1000 might end up 999.999999
after computations causing if something >= 1000 to fail, that's
why the code is full of +0.00001 to ensure numbers are pushed back
past the desired value and will display correctly after converting
to strings. Similarly, can't do equal comparisons with floats,
have to check if it's in a close range. When finding the lower
match, has to subtract a bit from the table value otherwise it
might be comparing say 82 to 82.000001 and miss the lower value.
Basically the rule is never count on floats being equal unless the
number is a small integer and no significant math has been done on
it. A=3: ... :IF A=3 THEN is fine, but A=A*1000: ... :IF
A=somevalue THEN might fail.. the key word here is might...
testing in immediate mode will work fine until you count on it in
a program. Do something like IF A>value-0.0001 AND
A<value+0.0001 THEN instead. This isn't a problem with blassic,
same considerations apply to all programming languages that use
binary floating point, embrace it or convert everything to long
ints but that's often messier than just recognizing the
limitations and adding offsets as needed.
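A quick awk check shows the same effect (any language using binary floating point behaves this way):

```shell
# 0.1*3 is not exactly 0.3 in binary floating point, so an exact
# equality test fails while an epsilon-range test succeeds
awk 'BEGIN {
  a = 0.1 * 3
  if (a == 0.3) print "equal"; else print "not equal"
  if (a > 0.3 - 0.0001 && a < 0.3 + 0.0001) print "close enough"
}'
```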
1/2/18 - Another handy blassic script... Updated again...
----------------- begin convmm.blassic --------------------------------
#!/usr/local/bin/blassic
rem convmm 180102
rem Converts between inches and millimeters..
rem 171219 - initial version
rem 171228 - added rounding, better input validation
rem 171230 - added toggle for rounding, doesn't show the
rem          menu every time (press ESC to redisplay),
rem          automatically calculates rounding factor,
rem          doesn't round/truncate if output contains "e"
rem 180102 - adjustable digits, better command input
rem adjust for rounded output format...
mmdigits = 4:rem initial max digits after dp for mm
indigits = 5:rem initial max digits after dp for inches
rounding = 1:rem initial rounding state
print "Metric/English Measurement Converter"
label selectfunction:
print "M) mm to inch I) inch to mm R) rounding D) Digits Q) quit"
label getfunctionkey:
print ">";
a$ = input$(1)
print chr$(8);" ";chr$(8);
if a$ = "q" then goto exitprogram
if a$ = "m" then goto mmtoinch
if a$ = "i" then goto inchtomm
if a$ = "r" then goto togglerounding
if a$ = "d" then goto changedigits
if a$ = chr$(27) then goto selectfunction
goto getfunctionkey
label togglerounding:
if rounding = 0 then rounding = 1 else rounding = 0
print "Rounding is ";
if rounding = 0 then print "off" else print "on"
goto getfunctionkey
label changedigits:
input "MM digits: ",a$
gosub validatenumber
if a$ = "" then goto changedigits2
if ok = 0 or val(a$) < 0 or dpc > 0 then goto invalidnumber
mmdigits = val(a$)
label changedigits2:
input "Inches digits: ",a$
gosub validatenumber
if a$ = "" then goto getfunctionkey
if ok = 0 or val(a$) < 0 or dpc > 0 then goto invalidnumber
indigits = val(a$)
goto getfunctionkey
label mmtoinch:
digits = indigits
if rounding = 0 then digits=12
line input "MM? ",a$
gosub validatenumber
if a$ = "" then goto getfunctionkey
if ok = 0 then goto invalidnumber
n = val(a$)/25.4
gosub printnumber
goto getfunctionkey
label inchtomm:
digits = mmdigits
if rounding = 0 then digits=12
line input "Inch? ",a$
gosub validatenumber
if a$ = "" then goto getfunctionkey
if ok = 0 then goto invalidnumber
n = val(a$)*25.4
gosub printnumber
goto getfunctionkey
label invalidnumber:
print "invalid"
goto getfunctionkey
rem make sure input is a valid number
label validatenumber:
ok = 1 : dpc = 0
a$ = ltrim$(rtrim$(a$))
if a$ = "" then return
for i = 1 to len(a$)
b = asc(mid$(a$,i,1)):rem digit must be 0-9 or dp
if (b<48 or b>57) and b<>46 then ok = 0
if b = 46 then dpc = dpc + 1:rem count dp's
next i
if dpc > 1 then ok = 0:rem too many dp's
return
rem print number n
label printnumber:
rem don't process if in scientific notation...
rem (happens if number is < 0.0001)
if instr(str$(n),"e") > 0 then print n:return
n = n + val("5e-"+str$(digits+1)):rem round lsd
n$ = str$(n)
dp = instr(n$,"."):rem position of dp
if dp = 0 then print n$:return:rem just print if no dp
n$ = left$(n$,dp+digits):rem truncate
l = len(n$) : tz = 0 : rem remove trailing zeros..
for d = l to dp step - 1
t$ = mid$(n$,d,1):if t$<>"0" and t$<>"." then tz = 1
if tz = 0 then n$ = left$(n$,len(n$)-1)
next d
print n$
return
label exitprogram:
system
----------------- end convmm.blassic ----------------------------------
Sample run...
Metric/English Measurement Converter
M) mm to inch I) inch to mm R) rounding D) Digits Q) quit
MM? 160
6.29921
Rounding is off
MM? 160
6.299212598
Inch? 6.299212598
160
MM digits: 2
Inches digits: 3
Rounding is on
MM? 160
6.299
>
The main functions are simple enough, inches=millimeters/25.4 and
millimeters=inches*25.4, but as usual it's about the details and
how nice you want to make it. Most of the niceness happens in the
printnumber subroutine, which takes a number n and a digits
variable specifying the maximum digits after the decimal
point. Before truncating it adds a certain amount for proper
rounding and also to avoid printing 2.999999999 instead of 3. The
rounding off function makes digits=12 for max resolution but still
enough is added to prevent that effect. After truncating, any
trailing 0's after the decimal point are removed and the resulting
string is printed. There are a couple exceptions to avoid errors -
if the number ends up in scientific notation (usually from being
less than 0.0001) then it just prints the number, and if after
rounding the result is an integer it just prints it. The command
input section uses blassic's input$(n) function to wait for and
get keystrokes, originally I used inkey$ (out of habit) but that
wastes CPU time. The menu doesn't show every time, instead shows a
">" prompt that's erased when a key is pressed.. press ESC to
redisplay the menu.
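The nudge-then-truncate idea from printnumber, sketched in awk with a made-up sample number:

```shell
# add 5e-(digits+1) so 2.9999999 prints as 3 instead of 2.9999,
# then truncate to the digit budget and strip trailing zeros
awk -v n=2.9999999 -v digits=4 'BEGIN {
  n += 5 / 10^(digits+1)           # nudge past representation error
  s = sprintf("%.12f", n)
  dp = index(s, ".")
  s = substr(s, 1, dp + digits)    # truncate after "digits" places
  sub(/0+$/, "", s); sub(/\.$/, "", s)
  print s                          # -> 3
}'
```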
5/1/18
Still here :-) Still on 12.04 :( (but it ain't so bad! life is
peaceful). One thing I had to do for VirtualBox is move to using
the .run file installs as the .deb installs are no longer
available for 12.04 and as of late they were buggy anyway, likely
my outdated system libraries. The .run files install to /opt/ and
as far as I can tell work perfectly, they include all major
dependencies so no more library-induced instabilities, and as a
bonus automatically uninstall previous .run versions when
upgrading.. download, make executable, run in a terminal, done.
Would be nice if all major apps came that way (many do), including
dependencies is much less buggy - depending on system libraries
has always been an issue for desktop Linux because what the app
developers use and what the system has are rarely the same and
that causes unpredictable and sometimes untestable bugs, and these
days the extra disk space is rarely an issue (not for
user-installed apps anyway, OS-included apps should still use
system libraries to keep the OS itself compact). Newer OS's take
the concept a step further and also containerize or sandbox the
apps too (snap etc).
Ubuntu 18.04 is getting good reviews, so when I do upgrade I will
probably jump to that, an LTS at a time. I do NOT want to install
from scratch, my system is my world, it has taken years to get
right for what I do and disrupting that is not on the table - I
just want to jack it up and insert a new OS under it that
hopefully works about the same. The 18.04 repositories include
Gnome Panel so should be no problem setting up my existing
Gnome-2-like setup - with any luck I won't have to set anything up
at all and it'll just keep it like it is. My biggest beef with the
upgrade process is how it removes software that is no longer in
the repositories - that causes lots of breakage with custom apps
and removes stuff I need. I wish I could just tell the installer
to leave outdated apps and old .so's alone.
One example is assogiate - a simple mime/file types editor. It
might be from 2007 but it works fine and performs a vital function
that I occasionally need, but it no longer exists in any supported
Ubuntu repository. Don't need it all the time but when I need it I
really really need it. To my knowledge (unless a suitable
replacement has been developed) the alternative is figuring out
very complex data structures and manually editing config files (or
booting into another DE like KDE that supports that as built-in
functionality), last thing I want to do when I have work to do and
it involves defining a new file type. I suppose before upgrading
will have to manually copy it and its dependencies out and
anything else I want to keep then restore it after the upgrade. Or
figure out how to pin stuff so upgrades won't touch it.
Anyway, the upgrade will eventually have to happen.. at the
moment the precise repositories are still up if I need something
but that won't last forever, new stuff I might want to use won't
be available, and eventually my old FireFox and Chrome will become
incompatible with newer web sites. So... basically the plan will
be to get a couple new drives - a 2T one for a new main drive,
another (maybe 3T) to become a new backup drive. My backup drive
is basically a complete copy of the root filesystem, usually
copied while the system is running - not an ideal way to make
backups but when copied to a fresh drive and grub is reinstalled
it will boot with minimal complaints. Basically, take the old boot
drive out, goes on a shelf in case something goes wrong, copy
files to the new drive, fix it to make it boot again. Remove
backup drive, on the shelf it goes, make new backups on a new
backup drive. At that point I'm back to my old system but with
more space and I can screw it up all I want without risking my old
system, now sitting on a shelf. Then do the upgrade shuffle, spend
a few days fixing the mess but in my past experiences with LTS
upgrades generally the OS itself continues to work fine, more a
matter of replacing the apps it removes and making my custom stuff
work again. Tempted to try going straight from 12 to 18 using CD
install medium.. last time I tried that there was an option to
preserve /home so just had to copy out /usr/local and /opt and
restore them afterwards along with a few files in /etc, result was
essentially a fresh OS with old user data and a few apps
complaining about old config files. Regardless, with (multiple)
backup files if it doesn't work can try something else. Even if it
doesn't work at all 12.04 isn't that bad... got reminded of that
yesterday when I had to do field work and tried to use Windows 10
to get work done - didn't work.. wasted half an hour trying to
update itself, came back with broken WiFi, couldn't read some
thumbdrives, tried to install some XP software I needed but that
was a total bust, VirtualBox wouldn't boot my old XP virtual
machine but probably can fix that with settings changes - but at
that point my boss pointed me to an old XP machine in the back he
kept around for such occasions, that worked. New is not always
better.
Finally a rant that has nothing to do with operating systems..
the war on ad blockers and web sites wanting to monetize personal
information. Seriously these web sites need to get with the
program or die. There is and always has been a way to have ads
that are not blocked and that do not bother me a bit - just put
them in line with the content or off to the side
(indistinguishable from content so not blocked). Instead of trying
to track people's interests, take a hint from where the surfer is
at and show related stuff. But instead these web sites want to use
cross-scripted malware-infected info-stealing ad networks that
pretty much make the sites unusable. Most sites like that aren't
that good anyway. The current ad revenue system is unsustainable
and users simply don't want it, and I don't want to have to "sign
up" for every web site I visit - got more logins than I can deal
with as it is, not signing up just to read articles and news that
I can get from non-commercial sources. So now anytime one of those
sites nag me about it I just delete the bookmark and never visit
them again. Ad blocker stays because it's basically a
malware/tracking blocker, and now that browsers are starting to
include ad-blocking by default they will have to change how they
deliver their ads, figure out some other way to make money, or
maybe just get off the web.
Speaking of blocking... soon I need to figure out the https
thing... not that I think it should be needed for an info-only
site like mine, but these days with so many ISP's modifying
content in transit going encrypted is the only way to make sure
companies aren't inserting ads or worse into my stuff.
11/12/2019
Been over a year since the last update - I guess I didn't have much to
write about. But time to start thinking about my future computing
environment, Ubuntu 12.04 still works great for my working
environment - as in the doing my job part - but is falling behind
in the media department. Chrome no longer supports 12.04, can run
a new version of firefox but it's out of band (generic), no
support for system themes and beyond ugly, and messes up my
existing firefox setup so running the new version just for some
things isn't practical. So no Netflix and Hulu or other things
that depend on later versions. Still does fine for general surfing
and it's fairly locked down (along with the rest of my system) so
not particularly worried about hackage. I've been watching
Netflix/Hulu using Windows 7 in VirtualBox but there's audio
glitching and I'm not able to run the latest version of VirtualBox
due to stability issues. I can't afford to be down too much but
something will have to give soon - probably get a new PC with a
new OS and run both systems until I migrate to the new OS.
I tried Ubuntu 18.04 in VirtualBox...
Basically just had to add gnome-panel and a couple of tweaking
tools to get a baseline-usable operating system. Hulu and Netflix
work well - the audio is much improved, I hardly notice any
glitching. Gnome Shell works much better now, almost no overhead.
The unity-like side bar is functional but it's still easier for me
to operate using Gnome Panel - one click to change between windows
versus not knowing what's open until I click the side bar and
select the window. Still has fully-functioning icons on the
desktop but this is probably the last version with proper desktop
icons (as in works like a regular file manager directory that
remembers positions), desktop functionality has been removed from
Nautilus. So.. good for 3.5 more years then will have to switch to
MATE or something that has real desktop functionality. For now
it'll probably do, but it's certainly more dumbed down than my
current 12.04 system.
By default there's no New File option until a file is put in
~/Templates, and it doesn't execute scripts until told to prompt.
No biggie. I find myself doing Alt-F s to save but that's been
removed along with menu bars in most apps. What menu options
remain are on the top Gnome Shell bar, from what I gather if not
running Gnome Shell that menu is just gone. Another somewhat
irritating deletion is now scripts don't show up unless a file is
selected, making it clunky to run scripts that operate on all
files (PlayAll, RenameTracks, MakeNewLauncher etc) and impossible
to run scripts from the GUI that change behavior based on whether a
parm is supplied. And no official support for Nautilus-Actions to
make up for the lack of parm-less scripts. I don't understand the
reasoning behind this change, was simpler and worked better the
way it was - they added more logic just to hide a useful feature.
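Seeding ~/Templates to get the New File entry back is a one-liner; a sketch (the template file name is arbitrary):

```shell
# give the file manager a "New Document" entry by putting at least
# one (empty) template file in ~/Templates
mkdir -p "$HOME/Templates"
touch "$HOME/Templates/Empty Document"
```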
Assogiate and other mime filetype editors no longer work (missing
dependencies), the only stock way to add new filetypes is to drop
XML files in ~/.local/share/mime/packages and run
"update-mime-database ~/.local/share/mime". Kind of tedious so
wrote a Nautilus script...
------- file AddToFileTypes ----------------------------
#!/bin/bash
# AddToFileTypes script for Nautilus Scripts 191112
# Right-click file and run this script to create a new mime filetype
# based on the extension. If no type is entered then it prompts to
# remove the filetype if a previous type was defined using this script.
# If a type is entered then prompts for the comment field and whether
# or not to keep the parent file type. Press F5 to update the
# file manager to pick up the new type. Changes are local, delete
# ~/.local/share/mime to undo all changes.
mimedir="$HOME/.local/share/mime"
packagesdir="$mimedir/packages"
filename="$1"
extension="${filename##*.}"
if [ "$extension" != "" ];then
 mkdir -p "$packagesdir"
 mimexmlfile="$packagesdir/customtype_$extension.xml"
 newtype=$(zenity --title "Add new filetype for *.$extension files" \
  --entry --text "New filetype... (for example text/sometype)\n\(clear to prompt to remove type)")
 if [ "$newtype" = "" ];then
  if [ -e "$mimexmlfile" ];then
   if zenity --title "Add new filetype for *.$extension files" \
    --width 350 --question --text "Remove existing filetype?";then
    rm "$mimexmlfile"
    update-mime-database "$mimedir"
   fi
  fi
 else # new type specified
  if [ -e "$mimexmlfile" ];then #remove existing xml first and update
   rm "$mimexmlfile"            #to pick up the parent filetype
   update-mime-database "$mimedir"
  fi
  oldtype=$(mimetype "$1"|cut -d: -f2|tail -c+2)
  > "$mimexmlfile" echo "<?xml version=\"1.0\" encoding=\"UTF-8\"?>"
  >> "$mimexmlfile" echo "<mime-info xmlns=\"http://www.freedesktop.org/standards/shared-mime-info\">"
  >> "$mimexmlfile" echo " <mime-type type=\"$newtype\">"
  comment=$(zenity --title "Add new filetype for *.$extension files" \
   --width 350 --entry --text "Enter description for comment field...")
  if [ "$comment" != "" ];then
   >> "$mimexmlfile" echo "  <comment>$comment</comment>"
   >> "$mimexmlfile" echo "  <comment xml:lang=\"en_GB\">$comment</comment>"
  fi
  if zenity --title "Add new filetype for *.$extension files" \
   --width 350 --question --text "Keep parent type $oldtype?";then
   >> "$mimexmlfile" echo "  <sub-class-of type=\"$oldtype\"/>"
  fi
  >> "$mimexmlfile" echo "  <glob pattern=\"*.$extension\"/>"
  >> "$mimexmlfile" echo " </mime-type>"
  >> "$mimexmlfile" echo "</mime-info>"
  update-mime-database "$mimedir"
 fi
fi
------- end AddToFileTypes -------
This is simplistic and can only create file types based on the
extension, but that covers most of what I need - I just need a way
to add context-sensitive associations to avoid adding unrelated
apps to say every text file. To use, right-click a file and the
script prompts for the filetype, comment and whether or not to
keep the parent filetype. For say a .red file it creates the file
~/.local/share/mime/packages/customtype_red.xml with contents like
this...
<?xml version="1.0" encoding="UTF-8"?>
<mime-info xmlns="http://www.freedesktop.org/standards/shared-mime-info">
 <mime-type type="text/x-redcode">
  <comment>Redcode Warrior</comment>
  <comment xml:lang="en_GB">Redcode Warrior</comment>
  <sub-class-of type="text/plain"/>
  <glob pattern="*.red"/>
 </mime-type>
</mime-info>
...then runs update-mime-database to enable the changes. The same
utility can be used to remove file types, if the XML file already
exists then if nothing is entered it prompts to remove the XML
file and update to revert back to the previous type. The
AddToApplications script elsewhere on this page can be used to add
custom applications to the list of programs a file type can be
associated with.
4/14/2020 (edited 4/15/2020) - Trying out the Ubuntu 20.04 beta
in VirtualBox...
That's after a bit of tweaking.. installed gnome-panel (start with
gnome-panel --replace) and other stuff similar to how I set up my
test 18.04 system. Also needs libavcodec-extra to make Hulu
Netflix etc work. One big difference is Nautilus no longer handles
the desktop - that party is over. The good news is Gnome Panel
still works fine and at least it has a minimal desktop icon system
provided by the desktop-icons Gnome Shell extension... the funny
tilted shadows under the icons showed up after enabling 3D
acceleration in VirtualBox - don't do that, it kills performance.
The default icon spacing was too much for my liking but only took
a few minutes to find and edit an obviously-named file (prefs.js)
to fix that.
The not-so-good news is the desktop no longer works like it was a
file manager - with previous versions of Gnome the desktop was
provided by the Nautilus file manager so almost anything that
worked in a file manager pane also worked on the desktop. The
extension mirrors files, folders and symlinks placed in the
~/Desktop folder, but for the most part you have to open the file
manager to manipulate desktop files, dragging files directly to
the desktop is not supported. But you can drag files off the
desktop to an open file manager window, or to another folder icon
on the desktop, and right-clicking shows a few options. The
desktop-handling code is now plain javascript, so there's the
possibility of adding more functionality. App launchers (desktop
files) are not supported [... not true - right-click a launcher
and select Allow Launching, not sure if it was always there or got
added with an update, but after playing around with it on my new
system the new desktop extension works. My Ubuntu session launches
stock and I have a couple launchers on the desktop that run
scripts to enable/disable the Caja desktop and Gnome Panel as
needed.]
One solution is Caja, the file manager from MATE. For full
functionality also need mate-desktop-environment-core and
caja-wallpaper. Also caja-open-terminal and caja-admin are handy.
Start Caja with the command caja -n --force-desktop by whatever
means, the command killall caja restores the stock Ubuntu desktop
(for testing made a couple gnome panel launchers, since replaced
with separate sessions). The Caja desktop has a right-click to
create a custom launcher on the desktop, or drag an existing
launcher from the Gnome-Panel menu. The Caja file manager displays
and runs desktop files but won't directly copy a desktop file to the
desktop (or anywhere else; neither did the Gnome 2 Nautilus),
however Nemo (another enhanced file manager) will copy .desktop
files. For that matter so does cp.
This can work...
Getting there!
Originally I was adding the gnome-panel --replace and the caja -n
--force-desktop commands to Startup Applications but this
seriously interferes with other sessions... after pulling in much
of the MATE core to get a usable desktop, might as well add a few
more things to make it a usable session. The environment variable
$DESKTOP_SESSION contains the currently selected session so a
script can be added to Startup Applications to selectively start
components. The sessions available from the logon screen are
stored in /usr/share/xsessions, copied the existing ubuntu.desktop
to myubuntu.desktop and edited it to change the name to MyUbuntu
and change the session to myubuntu. Gnome sessions are stored in
/usr/share/gnome-session/sessions, copied ubuntu.session to
myubuntu.session and edited the name to MyUbuntu. Haven't figured
out yet how to add components/commands directly to the session
files but don't need to, just made a mystartupapps.sh script...
#!/bin/bash
if [ "$DESKTOP_SESSION" = "myubuntu" ];then
 # startup apps for customized ubuntu session
 caja -n --force-desktop &
 gnome-panel --replace &
fi
if [ "$DESKTOP_SESSION" = "ubuntu" ];then
 # startup apps for stock ubuntu session
 gnome-panel --replace &
fi
...and added it to Startup Applications. Sections can be added as
needed for additional sessions, but if a section contains no
commands the entire if/fi block must be commented out (bash
doesn't allow an empty if/fi block).
To get a usable MATE session, in addition to the Caja packages
already added to get a desktop, also added mate-backgrounds,
mate-applets, mate-tweak, mate-indicator-applet and
mate-notification-daemon. After a bit of setup and tweaking...
[and more tweaking]
This is encouraging. I really need to upgrade, but have been
hesitant, afraid I'd lose too much functionality. Instead I've
been doing my media consumption and stuff that needs more
up-to-date software in VM's like these, leaving my work system to
do what it does. From what I'm seeing here, it doesn't take much
to modify a stock Ubuntu 20.04 system into something that works
for me. I'm impressed that the various software components work
well with each other - Ubuntu's side bar extension resizes to
accommodate the bottom task bar and Gnome Shell doesn't mind when
caja handles the desktop without having to disable anything or
losing existing functionality. There are choices.
4/15/20
Of course Linux wouldn't be Linux if there wasn't something to
gripe about! And that thing today is this new format called
"snap". I tried it, installed a few snaps to test... Magic8ball,
LibrePCB, Kreversi and another simple reversi game. The simple
reversi game started but it was full screen and way too
big, with no window or any way to resize. Kreversi wouldn't start at
all. Magic8ball was a terminal app but did not make an app/menu
entry - had to guess the binary name but it worked. LibrePCB
started up, at least to the opening screen. So... one and a half
out of 4, no wonder the app store is full of "doesn't work" and
"what do I do" comments. Uninstalled all of them.
The issues are numerous and quite serious ...
The package store does not list the installed components,
operating instructions or anything else regarding what to do once
the package is installed.
Where are these snap packages actually located on disk? The
directories under /snap declare 0 bytes on disk but when that
non-functional KDE run time is mounted it lists almost a gigabyte
of files. It's somewhere and it seriously stomped on my VM's
smallish virtual disk. Worse, it was not removed when I
uninstalled the game. Similarly, there's over 2 gigs of files in
two different Gnome runtimes and I have nothing installed besides
the app store itself.
The overhead of having extra file systems mounted that aren't
being used... after installing and uninstalling those four
packages...
Filesystem Size Used Avail Use% Mounted on
udev 1.9G 0 1.9G 0% /dev
tmpfs 394M 1.4M 393M 1% /run
/dev/sda5 25G 8.3G 16G 36% /
tmpfs 2.0G 0 2.0G 0% /dev/shm
tmpfs 5.0M 4.0K 5.0M 1% /run/lock
tmpfs 2.0G 0 2.0G 0% /sys/fs/cgroup
/dev/loop1 55M 55M 0 100% /snap/core18/1705
/dev/loop2 222M 222M 0 100% /snap/gnome-3-34-1804/21
/dev/loop0 94M 94M 0 100% /snap/core/8935
/dev/loop3 49M 49M 0 100% /snap/gtk-common-themes/1474
/dev/loop4 241M 241M 0 100% /snap/gnome-3-34-1804/24
/dev/loop6 50M 50M 0 100% /snap/snap-store/357
/dev/loop5 261M 261M 0 100% /snap/kde-frameworks-5-core18/32
/dev/loop7 50M 50M 0 100% /snap/snap-store/385
/dev/sda1 511M 4.0K 511M 1% /boot/efi
VB-Share2 451G 394G 57G 88% /media/sf_VB-Share2
tmpfs 394M 24K 394M 1% /run/user/1000
...yikes! At least the compressed size isn't as much as the
mounted size, but that's still 1022M of disk space when
there's nothing installed or being used. So much for the claim of
clean uninstalls!
Now to figure out how to reclaim my disk space... there's snap
list --all and snap remove packagename but no indication of what's
actually used or not. Removed the larger packages I didn't think I
needed anymore and it killed the Software Store app dead. No
matter, can't ever imagine actually using a snap package, at least
not in a VM, so used snap remove to remove all it would let me,
then fired up Synaptic to remove snapd. Got back 1.3 gigs and df
looks normal.
I get the desire and need to have a way to package apps so they
can run unchanged on a variety of Linux systems, but this ain't
it, at least not in the form I witnessed. For one thing if almost
every app ends up using its own runtime version, why not just
package the run-time files it actually needs (often a much smaller
subset of the full stack) with the app itself? Numerous apps do
this with plain tar.gz files that include dependencies, including
major apps like FireFox. Could be done with dynamically mounted
file systems to save the extract step and keep the binaries
compressed if worried about that. At the very least it needs a way
to safely manage the runtimes and offer to remove them when the
last package needing them is removed.
4/16/20
Playing around with the flashback session...
This is nice! The default desktop is handled by gnome-flashback.
It's a step up from the Gnome Shell javascript extension in that
you can drag and drop to copy files directly to the desktop -
but not symlinks; they work, but they have to be copied into the
~/Desktop folder and they don't display with the usual symlink
symbol. It does support application launchers and correctly
displays the icon, but the options to freely arrange the desktop
are grayed out and it doesn't show mounted volumes. It's a start
and good to see
action in this direction - in a pinch I could probably make it
work but the Caja desktop is better. At first the caja -n
--force-desktop command didn't work; the trick was to use
dconf: navigate to org|gnome|gnome-flashback and turn the desktop
off, after which it works fine when started from my startup script.
To go back to the flashback desktop run the command killall caja
and flip the desktop on in dconf. To re-enable the Caja desktop
flip the desktop off in dconf and run caja -n --force-desktop from
Gnome Panel's run applet. Anything done to Gnome Panel in the
flashback sessions affects all other sessions using Gnome Panel,
including adding the top panel. If kept under 24 pixels or so the
top panel hides behind Gnome Shell's top bar and doesn't
interfere.
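The dconf flipping plus starting/killing caja can be wrapped into a
single toggle script. A sketch, assuming the org.gnome.gnome-flashback
schema with its boolean desktop key as found on this 20.04 setup
(verify with gsettings list-keys org.gnome.gnome-flashback):

```shell
#!/bin/bash
# toggle between the gnome-flashback desktop and the Caja desktop
if pgrep -x caja > /dev/null; then
 killall caja                          # Caja is running - switch back...
 gsettings set org.gnome.gnome-flashback desktop true
else
 gsettings set org.gnome.gnome-flashback desktop false
 caja -n --force-desktop &             # ...otherwise switch to Caja
fi
```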
4/17/20
Seems to come down to the Caja-enabled Flashback or MATE -
Flashback is a bit more lightweight, MATE has more stuff to play
with, but either would probably work fine for a work system. At
the moment leaning towards MATE, no hacking needed beyond normal
setup and it has a nice applet for adjusting the theme - can mix
windows from one theme with the controls of another. Ran into a
slight theme glitch with Caja - when using dark themes the cursor
disappears when renaming files... white cursor on white background
and it doesn't blink. If it doesn't fix itself it shouldn't be that
hard to hack the theme files and fix it myself.
Some fun stuff... playing with a game called Endless Sky that I
found in the repository (version 0.9.8)...
It can be an action game but it doesn't have to be; I fly around
(mostly on autopilot) in my cheap shuttle running passengers and
cargo, accumulating credits so I can get a better ship if I want.
Right now I'm a peon and they mostly leave me alone. Bought a
laser but didn't use it - anything that could hurt me blows me
away before I can even turn around - traded it for better
shields. It's more about planning routes and missions, no real
goal besides make more credits to get better stuff. Of course the
better your stuff the more they want to take it... [a thing I
discovered.. if caps-lock is engaged when taking off then the ship
moves faster]
Back to computing. One area of concern is support for 32-bit
binaries; there have been rumblings of limited 32-bit support. Turns
out it's not an issue as far as I can tell, at least not for the
simplistic binaries I might need to run. On the stock system if a
32-bit binary is run from the terminal it just says "file not
found", not too helpful but it's just the system saying it doesn't
know what to do with the file. First step is to enable the i386
architecture...
sudo dpkg --add-architecture i386
...then add the needed 32-bit libraries. In the Synaptic package
manager, click Architecture then select i386 - there's a ton of
libraries available. For starters, install libc6:i386,
libstdc++6:i386, libx11-6:i386, libreadline5:i386 and
libncurses5:i386. That was enough to run 32-bit binaries compiled
by the old FreeBasic and run old pmars binaries I compiled in
2009. If a library is missing it will tell you when running from a
terminal. Or use the ldd command on the binary to see what it
needs. [ I think the fuss was that 32-bit binaries wouldn't be
supported at all; as far as I can read through the noise the devs
just didn't want the burden of maintaining up-to-date versions of
32-bit libraries other than the most common ones. Which is fine
since most of the libraries needed to run old 32-bit code are
among the common ones or haven't been updated in years anyway. ]
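Condensing the above into terminal commands (the apt package names
are my guesses at the 20.04 equivalents of the Synaptic selections;
libstdc++6 and libx11-6 are the actual binary package names - verify
with apt search first):

```shell
sudo dpkg --add-architecture i386   # enable the 32-bit architecture
sudo apt update                     # refresh package lists for the new arch
sudo apt install libc6:i386 libstdc++6:i386 libx11-6:i386 \
 libreadline5:i386 libncurses5:i386
ldd ./some32bitbinary     # "not found" lines mean more :i386 libs needed
```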
There is now a 64-bit version of FreeBasic, after adding
libtinfo5 (in addition to the dependencies mentioned in the
readme) it works fine, so don't really need to run the 32-bit
FreeBasic binaries, now I can just recompile my source for 64-bit.
For my Stars program I had to
slightly modify the source to avoid compiler warnings; even in qb
mode it doesn't approve of putting global arrays inside a code
block, so I copied the redims to near the top of the program and
changed redim to dim. Even with the compiler warnings the
resulting binary still worked, just a cosmetic thing.
4/19/20 - Running out of compatibility checks! Copied most of my
custom /usr/local/bin and /usr/local/share work and play stuff
as-is from my 12.04 system and everything seems to work fine, all
I had to do was install xterm. To set up custom file types used my
AddToFileTypes script, usually only needed for source code as it's
all plain text but I don't want programming and specialized tools
showing up in the right-click menus of every text file. So far
just had to define new types for .red .sim and .c files. To make
custom scripts show up in the association dialogs used my old
AddToApplications script.. not always necessary with Caja since it
supports associating files with custom commands but the script
allows specifying a custom name and whether or not to run in a
terminal. Both scripts are listed on this page.
Here's a neat script I made, a
general-purpose file lister...
[updated 10/5/21 to use lddtree rather than ldd]
--------------------- begin xtlist ----------------------
#!/bin/bash
#
# xtlist - list a text or binary file in a xterm window - 211004
# usage: xtlist "filename"
#
# Uses xxd for displaying binary files, minimum number of xterm columns
# needed to display without wrapping is (hexbytes * 2.5) + hexbytes + 11
# Uses lddtree from pax-utils for displaying dependencies
# Uses less for display, main controls are up/down arrow, page up/down,
# home for beginning, end for end, q to quit (or close xterm window).
# Less has a number of features, press h for help.
#
cols=90           # xterm columns (expands as needed for binary hex dump)
rows=40           # xterm rows
hexbytes=32       # xxd hexdump bytes per line
textgrep=" text"  # file output to determine if a text file
exegrep=" ELF"    # file output to determine if an ELF binary
if [ "$2" = "doit" ]; then
 file -L "$1" | if grep -Eq "$textgrep" ; then
  ( file -L "$1" | if grep "," ; then
     # display type if more than plain
     # special case for misidentified BASIC source code
     file -L "$1" | if grep -q " source," ; then
      head -100 "$1" | if grep -Eqi "^rem |^print \"" ; then
       echo "(looks like BASIC)"
      fi
     fi
     echo
    fi
    cat "$1" ) | less
 else
  # list binary file.. display output of file command, if ELF file
  # also display lddtree and readelf output, then list as a hex dump
  ( file -L "$1";file -L "$1" | if grep -Eq "$exegrep" ; then
     echo; echo "lddtree output..."; echo; lddtree "$1"
     echo; echo "readelf -ed output..."; echo; readelf -ed "$1"
    fi
    echo; xxd -c $hexbytes "$1" ) | less
 fi
else
 if [ -f "$1" ]; then
  if ! (file -L "$1"|grep -Eq "$textgrep"); then   # if not a text file
   xddcols=$[$hexbytes/2 * 5 + $hexbytes + 11]     # calc hex dump columns
   if [ $cols -lt $xddcols ]; then cols=$xddcols; fi  # expand as needed
  fi
  xterm -title "xtlist - $1" -geometry "$cols"x"$rows" -e "$0" "$1" doit &
 fi
fi
--------------------- end xtlist ------------------------
Put it in the file manager's scripts folder or in a path
directory for command-line usage. If it's a plain text file it
just displays it in xterm with no extra output. If it's anything
other than plain ASCII with LF line ends then it displays the
output of the file command at the beginning. If it's not ASCII
then it lists the file in hex using the xxd utility (I didn't know
about that command, handy!). If it's an ELF binary it also displays
the output of the lddtree command. The script automatically expands
the terminal width as needed to keep the hex dump from wrapping.
5/3/20
My old Atari 800 stuff works with Ubuntu 20.04...
...from my new Etc Files page
I just added. The 20.04 repositories include a fairly recent
version (4.1.0) of the Atari800 emulator and it works pretty much
flawlessly. Can't say the same about Vice, the C64 emulator; it's a
flashing stuttering mess (after finding and installing the roms),
even when running no software besides the C64 power-on screen.
Don't use it a lot but sometimes it's nice to play with the old
stuff, hope it can be fixed.
5/5/20 - Playing around with themes... MATE with Materia dark
compact controls...
...and with Adwaita dark controls...
Both with Numix windows and Ubuntu-Mono-Dark icons. I like the
Adwaita Dark controls better but there's a little glitch, when
renaming files it makes the text black on white but doesn't reset
the cursor making it invisible. Materia dark is ok, a bit blockier
in places (like the task bar) and not crazy about the dashed lines
in edit fields but can get used to it. Or learn to edit theme
files. I really like the MATE Appearance applet!
5/6/20 - Fixed the invisible cursor! While the intricacies of
editing CSS mostly elude me, I found
a trick to make MATE's Caja work with some dark themes like
Adwaita Dark. I copied the "/usr/share/themes/Adwaita-dark" folder
into my home's ".themes" folder and renamed it to
"Adwaita-dark-modified". Edited the index.theme file, changed to
Name=Adwaita-dark-modified and GtkTheme=Adwaita-dark-modified.
Finally, in the gtk-3.0 folder edited the (tiny) gtk.css theme and
added the following after the import line...
.caja-desktop.view .entry,
.caja-navigation-window .view .entry {
caret-color: red;
}
Selecting Adwaita-dark-modified controls using the MATE
Appearance app makes the rename cursor red (instead of invisible
white), problem solved. In the original tip the color was #000
(black) but with black on white characters using a color makes it
easier to see the cursor. Presumably I can add other stuff here to
override the defaults, but figuring out what to add is tricky...
the GTK3
Theming Docs are a start.
5/8/20 - Figured out a little bit more...
.caja-desktop.view .entry,
.caja-navigation-window .view .entry {
color: blue; /* text color */
background-color: gray; /* highlight/select color */
caret-color: red; /* cursor color */
}
...but so far haven't figured out how to change the actual entry
box background color.. it gets... complicated. Probably
sub-classes or something. No matter, it's perfectly functional the
way it is.
Adventures in Unicode
I've been messing around with Unicode UTF-8 encoding for the Atari 800 stuff, those old
8-bit programs made fairly heavy use of embedded screen control
and character graphics codes so to make the listings halfway
useful when viewed on a PC I made an Atari-to-PC converter
program...
-------------------- begin at2pc.bas -----------------------------------
'Atari800 to PC converter
'Original from years ago, last mod 5/8/2020
'This is QBasic-style code, to compile with FreeBasic
'use the command: fbc -lang qb -exx at2pc.bas
'
'In mode 0 Converts 9B hex line ends to 0D 0A (CrLf) and
'reverse text characters converted to [Reverse] text [Normal]
'In mode 1 converts common control codes to [Clear][Up][Down][Left] etc
'and common box graphics characters converted to [.--][-.-][--.][|--] etc
'In mode 2 converts the control and graphics codes to similar-looking
'Unicode characters rather than the [] codes for a nicer listing
'Anything not recognized converted to [$xx] where xx is the hex byte
'
programstart:
ON ERROR GOTO ferror
PRINT "Atari ATASCII to PC ASCII Converter"
INPUT "Convert mode [0]none [1]text [2]UTF-8 :", a$
convertmode = INT(VAL(a$))
IF convertmode < 1 OR convertmode > 2 THEN convertmode = 0
again:
INPUT "Input file :", inpfile$
IF inpfile$ = "" THEN SYSTEM
OPEN inpfile$ FOR INPUT AS #1 : CLOSE #1
OPEN inpfile$ FOR BINARY AS #1
INPUT "Output file :", outfile$
IF outfile$ = "" THEN SYSTEM
OPEN outfile$ FOR OUTPUT AS #2
Print "Converting... ";
currentbyte = 0
WHILE NOT EOF(1)
 lastbyte = currentbyte
 GOSUB getonebyte
 currentbyte = byte
 IF byte = 155 THEN
  IF lastbyte > 159 THEN PRINT #2,"[Normal]";
  PRINT #2, CHR$(13); CHR$(10);
 ELSE
  IF byte > 159 AND byte < 253 THEN
   IF lastbyte < 160 THEN PRINT #2,"[Reverse]";
   byte = byte - 128
  ELSE
   IF lastbyte > 159 AND lastbyte < 253 THEN PRINT #2,"[Normal]";
  END IF
  IF byte < 32 OR byte > 127 THEN
   outstring$ = "[$"+RIGHT$("0"+HEX$(byte),2)+"]"
  ELSE
   outstring$ = CHR$(byte)
  END IF
  IF convertmode = 1 THEN
   IF outstring$ = "}" THEN outstring$ = "[Clear]"
   IF outstring$ = "[$1C]" THEN outstring$ = "[Up]"
   IF outstring$ = "[$1D]" THEN outstring$ = "[Down]"
   IF outstring$ = "[$1E]" THEN outstring$ = "[Left]"
   IF outstring$ = "[$1F]" THEN outstring$ = "[Right]"
   IF outstring$ = "[$FE]" THEN outstring$ = "[Del]"
   IF outstring$ = "[$11]" THEN outstring$ = "[.--]"
   IF outstring$ = "[$12]" THEN outstring$ = "[---]"
   IF outstring$ = "[$05]" THEN outstring$ = "[--.]"
   IF outstring$ = "[$1A]" THEN outstring$ = "[`--]"
   IF outstring$ = "[$03]" THEN outstring$ = "[--']"
   IF outstring$ = "[$01]" THEN outstring$ = "[|--]"
   IF outstring$ = "[$13]" THEN outstring$ = "[-|-]"
   IF outstring$ = "[$04]" THEN outstring$ = "[--|]"
   IF outstring$ = "[$17]" THEN outstring$ = "[-.-]"
   IF outstring$ = "[$18]" THEN outstring$ = "[-'-]"
  END IF
  IF convertmode = 2 THEN
   IF outstring$ = "}" THEN outstring$ = chr$(&hE2)+chr$(&h86)+chr$(&hB0)
   IF outstring$ = "[$FE]" THEN outstring$ = chr$(&hE2)+chr$(&h97)+chr$(&h80)
   IF outstring$ = "[$1C]" THEN outstring$ = chr$(&hE2)+chr$(&h86)+chr$(&h91)
   IF outstring$ = "[$1D]" THEN outstring$ = chr$(&hE2)+chr$(&h86)+chr$(&h93)
   IF outstring$ = "[$1E]" THEN outstring$ = chr$(&hE2)+chr$(&h86)+chr$(&h90)
   IF outstring$ = "[$1F]" THEN outstring$ = chr$(&hE2)+chr$(&h86)+chr$(&h92)
   IF outstring$ = "[$11]" THEN outstring$ = chr$(&hE2)+chr$(&h94)+chr$(&h8C)
   IF outstring$ = "[$12]" THEN outstring$ = chr$(&hE2)+chr$(&h94)+chr$(&h80)
   IF outstring$ = "[$05]" THEN outstring$ = chr$(&hE2)+chr$(&h94)+chr$(&h90)
   IF outstring$ = "[$1A]" THEN outstring$ = chr$(&hE2)+chr$(&h94)+chr$(&h94)
   IF outstring$ = "[$03]" THEN outstring$ = chr$(&hE2)+chr$(&h94)+chr$(&h98)
   IF outstring$ = "[$01]" THEN outstring$ = chr$(&hE2)+chr$(&h94)+chr$(&h9C)
   IF outstring$ = "[$13]" THEN outstring$ = chr$(&hE2)+chr$(&h94)+chr$(&hBC)
   IF outstring$ = "[$04]" THEN outstring$ = chr$(&hE2)+chr$(&h94)+chr$(&hA4)
   IF outstring$ = "[$17]" THEN outstring$ = chr$(&hE2)+chr$(&h94)+chr$(&hAC)
   IF outstring$ = "[$18]" THEN outstring$ = chr$(&hE2)+chr$(&h94)+chr$(&hB4)
  END IF
  PRINT #2, outstring$;
 END IF
WEND
CLOSE
PRINT "Done."
GOTO again
getonebyte:
'return one byte in byte
'has to get 2 bytes from file
'then split between calls
IF odd = 0 THEN
 GET #1, , inword% 'get 2 bytes
 iwv = inword%
 IF iwv < 0 THEN iwv = iwv + 65536
 byte = iwv - INT(iwv / 256) * 256 'return low byte
 odd = 1
ELSE
 byte = INT(iwv / 256) 'return high byte
 odd = 0
END IF
RETURN
ferror:
CLOSE
PRINT "Error"
GOTO programstart
-------------------- end at2pc.bas -------------------------------------
This worked fairly well, it doesn't handle all the possible
characters but converts common ones used for cursor control and
box graphics and certainly makes the listings look much better in
an editor. For all the thousands and thousands of Unicode
characters there seems to be nothing that can represent simple
inverse-video text, so I used [Reverse] and [Normal] tags to
indicate that. Even more surprising, the files look fine in
Notepad on Windows 7; I was under the impression that Windows used
UTF-16 but I guess not. UTF-8 is a much better encoding scheme
that retains ASCII compatibility. Great, so copied the text files
to a web page directory and got garbage.. web browsers don't
interpret UTF-8 unless told. So after a bit of googling made
another script...
#!/bin/bash
# convert all *.txt files in current dir to UTF-8 HTML equivalent
# with [name].htm extension
removeorig=no
for file in *.txt; do
 htmfile=$file.htm
 echo "Converting \"$file\" to \"$htmfile\""
 echo "<head><meta charset=\"utf-8\"/></head><body><pre>" > "$htmfile"
 cat < "$file" | sed 's/</\&lt;/g' >> "$htmfile"
 echo "</pre></body>" >> "$htmfile"
 if [ "$removeorig" = "yes" ];then rm "$file";fi
done
...almost worked the first time, but the sed line was important -
otherwise the < characters can really mess up the display. Now I
have a web-browsable directory of Atari source code that sort of
resembles what it's supposed to look like. But that script can be
dangerous to data if run at the wrong time, especially if
removeorig is set to yes. Not so funny story... I thought I'd just
leave the script in that directory and rename it from convhtm.sh
to convhtm.sh.txt (or whatever it was called), thinking that when
I needed to update the files I'd rename it back and run it,
otherwise it would just list the code to the screen like .txt
files have always done. Boy was I surprised [when the script ran
instead of displaying... update 5-10-20 checked with FreeHostia
tech support - they were awesome! - turns out this is default
Apache behavior: "Files can have more than one extension; the
order of the extensions is normally irrelevant". There's a good
reason for this, it's so that multiple handlers can be triggered
by naming a file with multiple extensions. Just never noticed this
before as the previous server software didn't work that way. One
fix is when posting scripts meant to be viewed, name the file
something_sh.txt so the .sh handler won't be triggered. Another
fix is the .htaccess file, a way to alter server defaults.
Something like this fixed the issue for good: added an .htaccess
file to the root of my site containing those lines, with .sh added
to the lists, to globally disable scripts.]
Anyway... here's a more fixed up version of the txt2htm converter
script, this one just prints usage if ran without a parameter...
----------------- begin txt2htm -------------------------------
#!/bin/bash
# txt2htm - text to html converter - 200509
# usage: txt2htm [--nocheck] [--remove] file(s)
# does a minimal html conversion...
#  prepends <head><meta charset="utf-8"/></head><body><pre>
#  appends original file using sed to change < to &lt;
#  appends </pre></body>
# Converted files have .htm extension, overwrites existing files.
#
# uses the file command to make sure the file being converted is text
filematch=" text"
# that should always work but sometimes the file output is wrong
# if --nocheck specified then bypass the file check
if [ "$1" = "--nocheck" ];then
 filematch=""
 shift
fi
removeorig=no
# the --remove option is useful for converting an entire directory
# of files at once, for example: txt2htm --nocheck --remove *
# use with caution!
if [ "$1" = "--remove" ];then
 removeorig=yes
 shift
fi
if [ "$1" = "" ];then
 scrname=`basename "$0"`
 echo "txt2htm - converts text file(s) to minimal html file(s)"
 echo "usage: $scrname [--nocheck] [--remove] file [file ...]"
 echo "converted files have .htm extension added"
 echo "if --nocheck specified then bypasses file check"
 echo "if --remove specified then deletes original file(s)"
 echo "if both options specified then must be in that order"
else
 while [ "$1" != "" ];do
  if [ -f "$1" ];then
   if file "$1" | grep -Eq "$filematch" ;then
    htmfile=$1.htm
    echo "Converting \"$1\" to \"$htmfile\""
    echo "<head><meta charset=\"utf-8\"/></head><body><pre>" > "$htmfile"
    cat < "$1" | sed 's/</\&lt;/g' >> "$htmfile"
    echo "</pre></body>" >> "$htmfile"
    if [ "$removeorig" = "yes" ];then
     echo "Deleting \"$1\""
     rm "$1"
    fi
   else
    echo "\"$1\" is not a text file, file output..."
    file "$1"
   fi
  else
   echo "\"$1\" is not a file"
  fi
  shift
 done
fi
----------------- end txt2htm ---------------------------------
Determining the UTF-8 byte sequences for
the at2pc.bas program was tedious, found an online converter that
was fairly easy to use but still involved copying the Unicode
character and pasting it to the converter then grabbing the hex
bytes. For the next time I have to deal with this I wanted a
utility where I can type in a Unicode hex string and it gives me
the sequence with no messing around, so made this...
--------------- begin hex2utf8.blassic -------------------------------
#!/usr/local/bin/blassic
'hex2utf8.blassic - converts hex unicode to UTF encoded bytes - 200508
'prompts for hex string for a unicode character (often stated as U+somehex,
'don't enter the U+ part, just the hex) then prints the encoded byte
'sequence in decimal and hex and displays the character (if possible).
'Loops until 0 or an empty string is entered.
'This code is for Blassic, get it at: https://blassic.net/
'To convert this to normal BASIC remove "label " from labels
'then can compile using FreeBasic using the -lang qb option.
print "*** Unicode to UTF-8 ***"
dim bytenum(4)
label again:
input "Unicode hex string: ",a$
n=int(val("&h"+a$))
if n<=0 then goto programend
if n>55295 and n<57344 then goto badnumber
if n<128 then goto onebyte
if n>127 and n<2048 then goto twobyte
if n>2047 and n<65536 then goto threebyte
if n>65535 and n<1114112 then goto fourbyte
label badnumber:
print "Invalid code":goto again
label onebyte:
nbytes=1 : bytenum(1)=n
goto showresults
label twobyte:
nbytes=2
bytenum(1)=192+(n AND 1984)/64
bytenum(2)=128+(n AND 63)
goto showresults
label threebyte:
nbytes=3
bytenum(1)=224+(n AND 61440)/4096
bytenum(2)=128+(n AND 4032)/64
bytenum(3)=128+(n AND 63)
goto showresults
label fourbyte:
nbytes=4
bytenum(1)=240+(n AND 1835008)/262144
bytenum(2)=128+(n AND 258048)/4096
bytenum(3)=128+(n AND 4032)/64
bytenum(4)=128+(n AND 63)
label showresults:
print "Decimal ";
for i=1 to nbytes
 print bytenum(i);" ";
next i : print
print "Hex ";
for i=1 to nbytes
 print right$("0"+hex$(bytenum(i)),2);" ";
next i : print
print "Display ";
for i=1 to nbytes
 print chr$(bytenum(i));
next i : print
goto again
label programend:
system
--------------- end hex2utf8.blassic ---------------------------------
Pretty slick, can use a Unicode list such as the one
on Wikipedia and get immediate results, plus verify that the
OS can actually display the character. It's fun just typing in
random hex strings to see what comes up...
I write quite a bit of QBasic-style and Blassic code
because it's easy, I know how, and I don't have to worry much
about the language changing.. especially QBasic but Blassic is
fairly stable too, I've been using version 0.10.3 since about 2010
when it was released. Just checked and other than a few warnings
it still builds and works on Ubuntu 20.04 and didn't have to
install any dependencies besides the minimal build stuff I already
had installed. [...]
5/11/2020 - Unfortunately the newer Blassic version 0.11 does not
compile at all on Ubuntu 20.04. It works on my old Ubuntu 12.04
system and seems to have a more stable interactive environment
(not that that matters to me but version 0.10 builds would
core-dump almost immediately if used interactively). Yes, handy as
Blassic is, moving forward I need a better "quick and dirty"
programming method, based on something that's current and
well-supported. And it's looking more and more like that something
(even for scripting) is FreeBasic.
I've used FreeBasic for a while; not only can it compile
QBasic-style code mostly unchanged, the native language is quite
powerful with proper declarations, structured subs and error
control. I'm not a professional programmer in the usual sense but
often I need to make halfway reasonable programs for work, and
with FreeBasic I can recompile the same code I use for Windows
with only minor considerations like path separators and
interfacing with the OS.
Then I got to thinking... almost the entire reason I used Blassic
instead of FreeBasic for lots of stuff was because with Blassic I
could put #!/usr/local/bin/blassic on the first line and it just
worked.. no compiling needed. With Ubuntu and similar Linux DEs,
when a script is clicked it asks whether to run it, run it in a
terminal, or display (edit) it. By contrast, when a compiled
binary is run there is no such prompt - it launches without a
terminal, which for the kinds of programs I use is almost never
what I want. So in addition to the compile step, I also have to
make either a shell script or a launcher to run
it. This is a bit distracting when all I
need is to make a simple program that prompts for input,
calculates and prints the results (the majority of my programming
needs). Blassic was the only BASIC I had where I could just type
some code into a file, make it executable and run it without
messing around with multiple files and all the hassles that come
with that (like making a directory for the files). Real QBasic was
a close second in the easy department thanks to DosEmu/FreeDos and
some right-click script trickery, but it wasn't as
straightforward as Blassic, and being DOS it was limited to 8.3
filenames. So why not
just treat FreeBasic like it was a scripting language? So I made
this...
"fbcscript" - Scripted FreeBasic
This is a bash script that lets QBasic-style and FreeBasic
programs be run directly from the source code by adding an
interpreter line to the beginning of the program and making the
file executable, like any other scripting language. It defaults
to QBasic-style code but can use other FreeBasic dialects by
specifying the compile options on the interpreter line. Here's the
script...
-------------------- begin fbcscript ------------------------
#!/bin/bash
#
# fbcscript - make freebasic programs run like scripts - 200512B
#
# Copyright 2020 William Terry Newton, All Rights Reserved.
# This file is supplied as-is and without any warranty.
# Permission granted to distribute with or without modification
# provided this copyright notice remains reasonably intact.
#
# Copy this to (say) /usr/local/bin (otherwise adjust examples)
# At the beginning of freebasic scripts add the line...
# #!/usr/local/bin/fbcscript
# ...to compile and run qbasic-style code.
# To specify fbc options add --fbcopts followed by options (if any)
# To run native freebasic code with no options...
# #!/usr/local/bin/fbcscript --fbcopts
# To run native freebasic code with the threadsafe library...
# #!/usr/local/bin/fbcscript --fbcopts -mt
# To use the fblite dialect...
# #!/usr/local/bin/fbcscript --fbcopts -lang fblite
#
# Normally it refuses to run if not running in a terminal so
# compile errors can be seen and that's what's usually wanted.
# To bypass this check include --notermcheck first, as in...
# #!/usr/local/bin/fbcscript --notermcheck [--fbopts [options]]
# This only makes sense for graphical programs.
#
# If the compiled program exits normally (generally via system)
# then the temporary files are automatically removed, however if
# the terminal window is closed before the program terminates then
# the temp .bas and .bin files will be left behind in the .fbcscript
# directory. FB programs are usually small so this isn't much of
# a concern and it doesn't interfere with operation, plus it's
# an easy way to grab the compiled binary if needed. Periodically
# clean out the .fbcscript directory if desired.
#
# Note... the script file must be in unix format (as with any
# script interpreter called using the #! mechanism)
#
fbcscriptdir="$HOME/.fbcscript" # directory used for temp compiles
compileopts="-lang qb -exx" # default fbc compile options
titleterminal="yes" # attempt to title window according to script name
if [ "$1" = "" ];then # if called without any parms at all print usage
 echo "fbcscript - runs freebasic code like it was script code"
 echo "To use add #!$0 to the beginning of the program"
 echo "and make the file executable. Default fbc options: $compileopts"
 echo "add --fbcopts followed by options (if any) to change defaults."
 echo "add --notermcheck first to bypass the terminal check."
 exit
fi
bangparms="$1" # either script name or extra parameters
if (echo "$1"|grep -q "^--notermcheck");then # bypass terminal check
 bangparms=$(echo "$1"|tail -c +15) # remove from bangparms
 if [ "$bangparms" = "" ];then # if that was the only parm
  shift # to put the script name in $1
 fi
else # check for terminal
 if [ ! -t 1 ];then # if no terminal then exit
  # only alerts if zenity installed, if not just exits
  zenity --error --text="This program must be run in a terminal"
  exit
 fi
fi
fbcopts=$(echo "$bangparms"|grep "^--fbcopts")
if [ "$fbcopts" != "" ];then
 compileopts=$(echo "$bangparms"|tail -c +11) # grab desired options
 shift # remove added parameter
fi
scrfile="$1" # should be the full name of script file
shift # shift parms down to supply the rest to the program
if [ -f "$scrfile" ];then # if the script exists
 mkdir -p "$fbcscriptdir" # make compile dir if it doesn't exist
 scrbasename=$(basename "$scrfile") # get base script filename
 scrname="$fbcscriptdir/$scrbasename" # prepend fbcscriptdir to base name
 head -n 1 "$scrfile" | if grep -q "#!";then # if code has a shebang line
  echo "'" > "$scrname.bas" # keep line count the same for errors
  tail -n +2 "$scrfile" >> "$scrname.bas" # append lines 2 to end
 else
  cp "$scrfile" "$scrname.bas" # copy code as-is
 fi
 # compile the script...
 if [ -f "$scrname.bin" ];then
  rm "$scrname.bin" # remove binary if it exists
 fi
 fbc $compileopts -x "$scrname.bin" "$scrname.bas"
 if [ -f "$scrname.bin" ];then # if compile successful
  if [ "$titleterminal" = "yes" ];then # if titleterminal enabled
   echo -ne "\033]2;$scrbasename\a" # try to change the window title
  fi
  "$scrname.bin" "$@" # run with original parameters
  rm "$scrname.bin" # remove temp binary
 else
  if [ -t 1 ];then # if running in a terminal
   echo "----- press a key -----"
   read -n 1 nothing # pause to read error messages
  fi
 fi
 rm "$scrname.bas" # remove temp source
else # bad parm on the #! line
 if [ -t 1 ];then
  echo "Invalid option: $scrfile"
  echo "----- press a key -----"
  read -n 1 nothing
 fi
fi
exit
# changes...
# 200511 - initial version.
# 200512 - added error message and pause for bad #! option.
# 200512B - separated out script base name and used to set the
# terminal window title using an escaped print. Seems to work
# but if it causes problems change titleterminal to no.
-------------------- end fbcscript --------------------------
To "install" it, copy from between the cut lines, paste into a
new file named "fbcscript", and make the file executable like any
other script. It is designed to run from /usr/local/bin but it can
reside anywhere that can be specified directly on the initial
interpreter line. For testing it can be placed in the same
directory as the test scripts then use "#!./fbcscript" for the
interpreter line instead of the "#!/usr/local/bin/fbcscript" used
in the examples.
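The install can be sketched as shell commands (assuming the
pasted text was saved as "fbcscript" in the current directory;
adjust the destination if not using /usr/local/bin)...

```shell
# Assumes the script text was saved as ./fbcscript - copy it to
# /usr/local/bin and mark it executable (requires root):
sudo install -m 755 fbcscript /usr/local/bin/fbcscript

# Or for testing, keep it next to the test scripts and just do:
chmod +x ./fbcscript
# ...then use #!./fbcscript as the interpreter line instead.
```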
Here are a few sample scripts illustrating how to use...
#!/usr/local/bin/fbcscript
rem default for qbasic-style code
print "Hello World!"
print "Parm 1:";COMMAND$(1)
print "Parm 2:";COMMAND$(2)
on error goto notfound
open "file1" for input as #1
print "file1 exists"
goto progend
notfound:
print "file1 does not exist"
progend:
input "Press enter to exit ",a$
system
#!/usr/local/bin/fbcscript --fbcopts
rem normal freebasic language
dim a as string
dim e as integer
print "Hello World!"
print "Parm 1:";command(1)
print "Parm 2:";command(2)
e = open("file2",for input,as #1)
if e = 0 then
 print "file2 exists"
else
 print "file2 does not exist, error = ";e
end if
input "Press enter to exit ",a
system
#!/usr/local/bin/fbcscript --fbcopts -lang fblite
rem freebasic fblite variation
print "Hello World!"
print "Parm 1:";command(1)
print "Parm 2:";command(2)
e = open("file3",for input,as #1)
if e = 0 then
 print "file3 exists"
else
 print "file3 does not exist, error = ";e
end if
input "Press enter to exit ",a
system
The default is QBasic-style code, which is compiled using the
options "-lang qb -exx" to enable "on error goto label" error
checking. It has resume, but as far as I can tell it doesn't work
quite like it does in QBasic; generally I don't use resume and
just trap and retrap errors as needed, the most common use being
to detect if files exist after entering a filename. To use other
FreeBasic dialects override the default options using --fbcopts,
if used by itself then no compile options are used for the stock
FreeBasic language where all variables and subroutines must be
declared first. With this dialect error handling is easier as
things like open return an error status number that can be used
instead of QBasic's trap spaghetti method. Use --fbcopts -lang
fblite for the "fblite" dialect which is a mix of the old and new
syntax.. don't have to declare variables, type can indicated by
suffixes but has the better open error handling and other newer
language features. This is explained in the
fine documentation.
Generally you'd want to run simple BASIC code from a terminal so
PRINT, INPUT etc. will work, plus you need somewhere to display
any compiler errors - FreeBasic is pretty good about handling
compile errors; it displays nothing at all unless something is
wrong, then it lists the offending line with a pointer to what it
doesn't like. Most popular Linux GUIs prompt to run or run in a
terminal when double-clicking an executable script; choose Run in
terminal. If compile errors occur, then after listing the errors
it prompts to press a key before closing the terminal window. If
you forget and just click Run, by default it pops up a Zenity
error dialog saying the program must be run in a terminal; this
is to prevent situations like a program running in the background
waiting for input that will never come and having to use killall
or other means to terminate the hung task. Sometimes this is not
desirable, for example programs that don't do any visible I/O or
graphics programs that draw their own window. To disable the
terminal check add --notermcheck as the next parameter after the
fbcscript interpreter command and before any other options. For
example...
#!/usr/local/bin/fbcscript --notermcheck --fbcopts -lang fblite
rem graphical variation
screen 12
print
print " Hello world in a graphics window"
print " Press a key to exit"
line (10,70)-(630,70),15
line (630,70)-(630,470),15
line (630,470)-(10,470),15
line (10,470)-(10,70),15
while 1
 a$ = inkey$
 if a$ <> "" then system
 x = int(rnd(1)*600)+20
 y = int(rnd(1)*380)+80
 c = int(rnd(1)*16)
 pset (x,y),c
wend
...which fills the box with static. For debugging, you can still
run the script in a terminal to see compiler errors.
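For what it's worth, Linux passes everything after the
interpreter path on the #! line to fbcscript as a single
argument, which is why the script strips the option keywords
using byte-offset tail commands instead of normal parameter
handling. A minimal sketch of that prefix stripping (the option
strings are the same ones used in the script above)...

```shell
# The kernel delivers "#!...fbcscript --notermcheck --fbcopts -lang fblite"
# to fbcscript as ONE argument:
bangparms="--notermcheck --fbcopts -lang fblite"

# drop "--notermcheck" plus the space (13+1 chars, start at byte 15)
bangparms=$(echo "$bangparms" | tail -c +15)
echo "$bangparms"     # --fbcopts -lang fblite

# drop "--fbcopts" plus the space (9+1 chars, start at byte 11)
compileopts=$(echo "$bangparms" | tail -c +11)
echo "$compileopts"   # -lang fblite
```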
The files created by fbcscript (not counting files that the
program itself makes) are written to the directory ".fbcscript" in
the home directory. For each program run it creates the file
[basename].bas containing the FreeBasic code with the #! line
stripped out (and replaced with a single comment line to keep line
references the same), and (assuming there are no compile errors)
the file [basename].bin containing the compiled binary, plus
whatever temp files are created by fbc during the compile process.
Normally when a program terminates (via the system command) the
fbcscript script removes the temp files (but not the temp
directory), but if the terminal is closed before the program
terminates then the files will be left behind. Other than
building up some file cruft this causes no problems; the next
time the
program is run it will overwrite its temp files if they exist.
FreeBasic programs, especially adapted to script-style usage, tend
to be small.. kilobytes not megabytes.. so compared to other file
wastes in a typical system this is fairly minor. I did it that
way, and used the basename of the script instead of a single name
to avoid possible issues with running multiple programs. The fbc
compiler is fairly fast, but I'm not sure how happy it would be if the
source it was compiling got yanked out from under it before it
could finish. Once the compile is done then there's no problem
running multiple instances of the same program so long as they
aren't all launched at about the same time.. the only effect is
that, if running from a terminal, the rm errors become visible if
something
else deletes the temp files first - Linux does not care if a
running program is deleted once it has been loaded unless it's
something tricky that accesses its own binary or something (yeah..
don't do that).
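The temp file naming works out like this - a minimal sketch using
the same variable logic as the script ("myprog" and its path are
hypothetical)...

```shell
fbcscriptdir="$HOME/.fbcscript"      # temp compile directory
scrfile="/path with spaces/myprog"   # hypothetical script file
scrbasename=$(basename "$scrfile")   # just "myprog"
scrname="$fbcscriptdir/$scrbasename" # per-script temp base name
echo "$scrname.bas"  # temp source, shebang replaced with a ' line
echo "$scrname.bin"  # compiled binary, removed on normal exit
```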
The compile and run process does not change the script's current
directory and all parameters passed to the script are passed on to
the running program where they can be accessed using command(1) or
command$(1) etc. Quoted parameters stay together and there's no
problem with spaces in the script filename. One gotcha is that
the script must have unix-style (LF-only) line ends or it throws
an odd error message; FreeBasic itself doesn't care, but this is
how the scripting #! interpreter system works - if there's a CR
on the interpreter line it gets added to the command line
parameters, with unhelpful results. If there are no additional
options then it causes a "bad interpreter" error, and if passed
to the fbc compiler it outputs a badly formatted error message. I
use the "flip"
utility to flip text files between unix and dos format.
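If flip isn't installed, the same check and fix can be sketched
with standard tools (the file names here are made up, and tr -d
'\r' is a stand-in for what flip does when converting to unix
format)...

```shell
cd "$(mktemp -d)"                        # scratch dir for the demo
printf 'print "hi"\r\n' > dosfile.bas    # fake DOS-format script
if grep -q $'\r' dosfile.bas; then       # a CR means DOS line ends
 tr -d '\r' < dosfile.bas > unixfile.bas # strip the CRs
fi
```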
5/12/20 - Just One Little Problem - it didn't update the terminal
window title, which remained a useless "Terminal" or no title at
all. That just.. bugged me; the script was near perfect for what
I needed besides that (it really helps to see something in the
task bar). A couple of rabbit holes later I stumbled onto the
trick, which ended up being much simpler than I was expecting...
echo -ne "\033]2;Window Title Here\a"
That outputs [ascii 27]]2;Window Title Here[ascii 7]. Everyone
was doing complicated PS1 and PROMPT_COMMAND tricks to maintain a
particular title at a command prompt, but I just need the title
when the program is running. The should-be equivalent PRINT
CHR$(27);"]2;titlestring";CHR$(7); does not work from FreeBasic,
nor do ANSI screen codes. A workaround is to get the OS to print
the string.. SHELL "printf '\033]2;Hello World!!!\a'" works.
New Stuff!
5/19/20 - My web pages here are now accessible using the
encrypted https protocol. It's been on my mind for awhile, but I
wasn't in that much of a hurry because there are no logins or
anything like that here. I know.. without encryption a bad actor
can alter the content en route, which to me always seemed
unlikely - one would have to work for an ISP or otherwise have
access to the raw data stream to do something like that, and why
would they bother with a target like my hobby stuff? As it turns
out, some ISP's are plastering ads on any unencrypted content
they can find. That's different! Another reason for dragging was
the process was complicated (for me) and assumed access to the
web server, something someone on a hosted platform does not
really have beyond configuration files and whatever the control
panel lets them do. So I got to poking around and searching the
net, and stumbled on a "Request a Let's Encrypt certificate" link
under the domain options. After a few hours for the DNS records
to update, https worked. Yay. There's a way to make it
automatically redirect http to https but I didn't do that; almost
all internal links here are relative, so once on https it should
stay there.
I (finally) ordered a new computer... a ZaReason Limbo 9200a with
a 6-core 3.2GHz AMD processor, 32 gigs of RAM and 8 terabytes of
disk storage. As somewhat of an old-timer just writing that
sentence is almost surreal, especially considering that these days
those aren't particularly impressive specs.. just (other than disk
space) a lot more computing power than I'm used to. As with my
current system (or as it was until the drive failed), the 4T disk is
for internal backups, the rest is split between a 1T drive, a 1T
M2 SSD, and a 2T drive. Hopefully they'll put the OS (Ubuntu
20.04) on the SSD with /home (and maybe /var?) on the 1T drive..
didn't leave any notes with the order figuring they likely know a
lot more about it than I do.
6/2/20 - Got the new system, continuing on the More Ubuntu Stuff page.
4/6/21 - Colorized the code on this page using VIMconvert.
Terry Newton (wtn90125@yahoo.com)