I require one common routine which:-
The implementation contains further documentation:-
begin 644 transpose.c.gz
M'XL(`/-D<ED"`^V::V_;-A2&O_M7$"@PV$[2B(SF&DM3H.F&!5BR`6DS8!CZ
M@99IBR@EJB(5-UCZWW>HBW67+"]I"BQN69GBX<MS>'DL'/5XBLZY1E=4A_P+
M^A!27P52,?1'P$*JN?05FAZ/CJ=H_&Y"+/P*?7`9>@>W9:AYY+TTK::9^YKY
M2[9$*QFB!=='"^JON;]&BFF%Y`HQOG8UU$).!5(Z9-13+U-M*I1$D6*K2,3]
M`T%]&B)'>D'(E`(OD&:.Z_//$5/;(8T[UBMR`FHTU*E4>M-&BMZRV/(%7X%C
M*_3[^?E?HQ?PC?LLKJ#YZ`7XS%=)`!IQA9A_R\UX$(AVJ49Z.R$;+@22CA.%
M$"L*0NF`9^`K#1V7@W<Z`E?!2KLH9&NN-`NW\<E`<X\K$.5>()C'?)W.+4P,
M%6L90C</I)@1ON4PCUE7]L41T=),I&9>($,:WJ%;"I.X$$P=HOET;F8;><GZ
MY>[";*J?CK+Q$<+3F1U;9LZ]S-O(](24VE2AT9[B66MC-GRM3;LPF?`W@)7A
M3B1H*.Y,\+!KJ*_1QF4A*[KW]OH*=E;D+;()4*#=X=7;/Z]S2SQK];$R/]7I
M7TDAY`91L]4\N5W6HF=5!84^1Q`4^+\TFU["!,'$"NE\*OJG-\4&L\R@H>(5
M5AL:!/D"@S&CCINIF@%4M#A:\G@;)$.0*:D/$?<**`^-NK$P/9NUM_T;U?$4
M]Z@;BXIZ<N:#2&_G"EV]/T=MGTO3EM@='!T==9;4[AZ]@C*#\B,4&\H)%`(%
M0[&@/.L]ZSV27LPP`U/X.?)B6&SWN;$I?FKUK3_E3[W^$'97$G[F&NPNI%@6
M[7Z6&W\7O4NVT@_I7]WN:>>O.%^=>C=!2:\XGT6[Z_BQYO'6=\_YBY^J5N;W
M9,5#I=OV\N,SNUI_UNO6JS+KOS+Q?Z2W`[-;KY4S>)]RHE[/[!(>W*=<K=>K
M>@E7Z_6A_K7Y56-)S*_<KVH]LTOXE?>OUI_G[YO-7\YLQ1SI+[\]M-NN3ZV7
MP;3M^M1Z&;3:KD^ME\&T[?J$>CW0WLV?"W.0.ZY#]:YZ-)_]>_:OL']EI!\_
M$=)5ANK->LH^/.TJ0_7LGK(/3[O*4#W24_;A:5<9JF?UE'WX'&>59?Y60-!P
M#8\JVJ6^R9$64Z8;&<%#$CQ8A1N7"0_16\J%25CG^5EDDJTR,(GAMEQZ(1FY
MIMQ''O>Y1P5:,)^MN`8%AZ'%G6:("K[V35;7)"FIUF"=YT#ATYB,UF'$3![:
M1T+*`$5^*(4P>78*3UY2NQ!;FK9/`XZS\RPTOU.YNGE;0:,EEX?(HW=HPY6+
MM,Q2TQ"OHZF_CL-LG$45.2ZBE80S3O+>GR,(3-\A14W6NIQVQW:_R:S?9-YK
M0JQ^DTY?8.5@,B%JAR:O)(ZG:<\TYV[Z*I>O3(I=?5+)BY9%_`:CF,B.?',C
M62S8(]D$9M;Y*Q:ZH7?U?F:CQ'GX!-9=_4:WDB_SY9I[:C&.?`5[#*P<%Y9R
MZED3],_(B!=O"^M0X%/SPLF\;[K^Y=>9O7WA=/7V_6_6);(6U@X?#)^VZ\WE
M95GT`D2[.NPR8$WT)O6T2[AOT*HH;@[?&!=+=WM--`E_B$1_^#@)O]RM+M+=
M7A4EI?!Q^F=@O28:A[^?5FOX)`X_M=H:#ZP;T<(!$1+0&O_#@=,>_MN\@3V>
M?SQM-]+6H<:F'2`[%M:9=2JLUTDW^'9PD!Q!9,2$]1&:XUILC(TQ3HPQ?,N-
M,_/[LW'SL!//@O;I_$#@CZ]?@]34J$P2\:^CI/1XI*VS9)0?DF.?=-:X=/?B
MM.C]N-AT,[D?:PM&'QOQ`SR9VA-S"[]Y4[IUNH\[N-$=W.X.KKM#ZNZ0/=TA
MC>Z0=G=(W1U<=P?O[$[_ALDW1.HB#-2T+UXPH5@!_R>D&?]=X!S.]N&,'0[0
MX70<CK[A7!L.K2)LBARRRQSJIH\]A#ZDESXY<^QAS(G].$N^MF[F>67\[/"9
MH4ITRD_@0=(QM[@X+3H?GX-Q1>0B.X!)T)-)N4NNF/6LCG&9'>JR0'?PWP_A
M[.^+<'8WX4@CX>QD?4K/M)-QC7B=R,.S?N3U(ZX):4,1U@^@?N#T`Z9(#>7*
M,$<**2,E:6R'"AD"%;L7*O%P*5;(,*R0'"ND=0?9^V+%?G"LV#6LV/MCA3P(
M4W$O4W%_\+@<O#V8J;@<O+U;\-\/Q$@WQ.Q&B)'](=;&K79>M7.JG4\-7.KE
M42W3T,X1G&]@O#/_S5);[:?7VFXNTG1ZK7P#6SN>WHIBUK,ZQM#3B_=%E]5^
M>JT&K.#^X#M.KU5#5^,80T\OWA==);])2_`Y5DA_\*0<_+PQ^'DM^,(8Y>#G
<U?,:_Z_M\@6AD.DH]$]'7T>C?P&?0(9LUBX`````
`
end
The bit matrix transpose algorithm is its own inverse, so the test performs two transpose operations. The test input uses a marching bit test, which is a more thorough version of a walking bit test:-
begin 644 transpose-test.c.gz
M'XL(`/QD<ED"`XV1P6[;,`R&[WX*SD$`Q342V6[1`G%V2"^[M+ODL&'+`-66
M$P*2;$C*L*'HNU>T/3?9>I@.$L6/XO^#6B6P10\/PEO\!3LKC.M:)^%S)ZWP
MV!K82>?AD[!&.@?)*EHEP.X7.<]N87>4<-\:UUJ/)[TD2AAUIZ26QCO0PE9'
M-`=X"B*>.C6M[2]Z4/238OM'<>I#&OPV+\!Y87V?G:&IU*F64#I?8[L\?KQ(
MV2!UD8NG_LLJIN>-J64#C]OMUV@6(C2RO\!=-).FQB:*T)`Y-(P"80]56AV%
MA20)\<\%/$<`)^/P8&0-/='7WZC%/M4W0[#^IT3Q5&6IRE-5K*-`M=1.>J:O
M4Y[2D\7Z+'GSE@S9,"^F\@U?J[S,PW9U-7@8248D*ZD^G&]PQ)PP'S$_QT"^
M5;;_L6%962K>6QB!U%7WFXP$@Y.]84WSO-/N*92<H5ZP(,%B%"PN!0&Z\$&^
M8?&<YU]BFI<J]F<=7J*_"[^;^/_$L6%D6W<TU."\M_UAP]_7;P0J$![8'-,Y
?+H)*_S_\'2_#2?L+?8>5_F0-H\IP?P5])VKL/@,`````
`
end
After more extensive testing, the code requires something akin to:-
#ifdef __AVR__
#define REG16 1
#endif
#ifdef __arm__
#define REG32 1
#endif
(This is the 35th of many promised articles which explain an idea in isolation. It is hoped that ideas may be adapted, linked together and implemented.)
I'm working on a 3D surround sound speaker array as part of a larger project. I am also expecting a delivery of 13 Watt quadraphonic audio amplifiers for a tenuously related project. I have immediate tasks to complete:-
This is very likely to take more than three weeks. I may be off-line for the majority of this period.
(This is the 34th of many promised articles which explain an idea in isolation. It is hoped that ideas may be adapted, linked together and implemented.)
In the book Waldo by Robert Anson Heinlein, a 1950s style workshop (lathe, hammer, saw) is used to make a matching pair of 1/4 scale three-finger mechanical hands. This requires the tedious work of drawing wire, winding motors, constructing capacitors and suchlike. A 1/4 scale set of tools is also constructed. Using the hands and the appropriately sized tools, 1/16 scale hands can then be constructed. Likewise for 1/16 scale tools. Then, 1/64 scale hands and tools can be constructed. (Incidentally, this process requires very little material. For example, a 1/64 scale hammer handle requires very little wood.)
A stable work environment for an Arduino is an analogous problem. While it is possible to install Arduino development software on any system, this is a haphazard approach which can fail at inopportune times. There are also the problems of security and reproducibility. My solution to this problem is pragmatic but far from ideal. Ultimately, chips used in all Arduino designs are proprietary. Deployment of code onto these chips uses avrdude for the AVR architecture or bossac for the ARM architecture. These handle architectural quirks, such as EEPROM partitioning, EEPROM privileges as seen from the running code and recovery from failed programming cycles. I also assume conversion of ELF blobs to raw binary occurs at this stage but I have seen no mention of this process. Regardless, all of this occurs over a virtual serial port over USB which is itself poorly defined. The implementation may involve an FTDI serial adapter or functional equivalent, and therefore safe and continued use is mutually exclusive with Microsoft Windows drivers.
For development of a speaker array, the solution to this mess is as follows:-
The advantages of this arrangement are numerous:-
Now, I'm beginning to burn in my new laptop and starting to set up the software on it. This process is proving not quite as trivial as I hoped, the main problem being the Galago's HiDPI display. It seems that a lot of applications make assumptions about screen sizes that break stuff. The next major application that suffers from problems due to the high screen resolution is Emacs. I enabled desktop scaling in System Settings > General, and that manages to fix almost all the important apps, but it does something unexpected with Emacs. With desktop scaling on, Emacs expands to fill nearly the entire screen while claiming that its window geometry is only 80x20. Attempts to set Emacs' window geometry manually to something reasonable, via the -geometry command line switch, set-frame-size in .emacs or default-frame-alist, don't help. Each results in a smaller Emacs window briefly appearing before the window again explodes to an irritatingly large size. This had me stumped for a while, until I realised that it had something to do with desktop scaling: turning off desktop scaling results in a reasonably-sized window. Eventually, some judicious searches turned up this link, and I found a useful workaround by adding env GDK_SCALE= to the launcher command. Most standard apps are okay, but some others need special settings to be usable on HiDPI displays.
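The workaround amounts to a one-line launcher change. A sketch, with the desktop entry path given as an illustration only:

```shell
# Demonstrate that "env GDK_SCALE=" hands the child process an emptied
# variable, so GTK falls back to unscaled rendering for that one application:
env GDK_SCALE= sh -c 'printf "GDK_SCALE=[%s]\n" "$GDK_SCALE"'
# prints: GDK_SCALE=[]

# Real use, e.g. the Exec line of a launcher such as
# ~/.local/share/applications/emacs.desktop (path illustrative):
#   Exec=env GDK_SCALE= emacs %F
```

Everything else on the desktop keeps its scaled rendering; only the launched application sees the emptied variable.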
Pale Moon also seems to be only partially scaled. Many display elements, such as scroll bars and checkboxes, are very small, but I can live with that for now. Gimp and Inkscape both still have very small buttons that didn't properly scale. As I install more and more apps, it seems that far too many of them make the assumption that the display isn't going to be much more than about 1280×720. This should change with time but right now it is a bit of a pain.
(This is the 33rd of many promised articles which explain an idea in isolation. It is hoped that ideas may be adapted, linked together and implemented.)
I'm working on a 3D surround sound speaker array. I received a clone Arduino Due about a week earlier than expected. This has put me in a moderate panic in which I have a depreciating asset which is not working. I've been jolted into resolving loose ends, such as Voltage levels, API and object code deployment. From personal experience and with sincerity, I recommended acquisition of a vintage round tuit. Well, I'm going to consider an Arduino Due as an even more effective Rectangular Tuit.
This tuit is about the size, power and price of a Nokia 3210. Unlike many credit card computers, it does not require a storage card during operation. Instead of screen, keys and radio, there are about 50 I/O lines. Operating system overhead is optional. Therefore, it is expected that timer interrupts at 48kHz will be the highest priority interrupt (or second highest if USB communication takes precedence.) 84MHz divided by 48kHz is exactly 1750 processor ticks per sound sample. Within this period, a processor must:-
It may be useful to vary this process for 192kHz monophonic sound or suchlike. Regardless, with available processing power and I/O, it may be possible to drive 64 speakers and/or multiple sets of headphones. Each set of headphones may have accelerometers and/or Hall effect devices for the purpose of maintaining Ambisonic sound-stage position.
Unfortunately, timing is complicated by the Arduino API, which only permits timers to be specified to the nearest microsecond (rather than nanosecond). Therefore, it may be easier to play sound in multiples of 50kHz. The alternative is to patch the Arduino libraries or implement a system which plays samples slightly too fast. The pitch shift would be approximately 0.2% but I find it less tacky to omit or repeat 1:600 samples or so rather than endure a pitch shift.
There are further limitations. The tuit uses a different instruction set to other Arduino boards. Support is greatly restricted. The Jul 2017 release of Raspbian doesn't work with the tuit. Nor is it available from an Ubuntu Linux desktop which was configured to install everything. However, it appears that using a hateful IDE is definitely optional.
Typically, micro-controller development uses an IDE. This may be Eclipse, Atmel Studio, Arduino's IDE or something else. I vastly prefer to be unconstrained by text editor. This should be the norm when using serial line programming (or virtual serial port programming tunnelled over USB). However, it is now customary to have configuration within an IDE to specify the virtual serial port name and invoke an external chip programming utility from a menu item.
For Atmel AVR chips, avrdude appears to be the default choice. For Atmel ARM chips, bossac performs the same function - or not if distributions don't have the configuration. Either way, I'd prefer to perform these steps via a Makefile. In addition to simplifying deployment, it fixes problems with deployment speed.
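For the ARM case, such a Makefile might look roughly as follows. This is a hedged sketch, not a tested deployment recipe: the port, file names and the ELF-to-binary step are assumptions, the bossac flag spellings should be checked against the installed version, and recipes must be indented with tabs.

```make
# Illustrative deployment rules for an Arduino Due style board.
PORT    ?= ttyACM0
ELF      = speaker-array.elf
BIN      = $(ELF:.elf=.bin)

$(BIN): $(ELF)
	arm-none-eabi-objcopy -O binary $< $@   # the assumed ELF-to-raw-binary step

upload: $(BIN)
	stty -F /dev/$(PORT) 1200               # 1200 baud "touch" to enter the bootloader
	bossac --port=$(PORT) -e -w -v -b $(BIN) -R

.PHONY: upload
```

With this in place, `make upload` replaces the IDE menu item, and any text editor may be used.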
(This is the 32nd of many promised articles which explain an idea in isolation. It is hoped that ideas may be adapted, linked together and implemented.)
I'm promoting a set of ideas which rely upon multi-media and gain maximum value from ease of access to third-party multi-media. I would like to see people fairly rewarded for producing high quality, factually accurate content. Unfortunately, by mentioning "fair", "quality" and "factual", I'm already in a quagmire. Without resolving these issues, I'll extrapolate content distribution which is likely to occur with or without the legal permission of content producers.
Although patents and copyright should be considered separately, I have one patent pending with a failing nation state. Also, I believe that it is pertinent to mention that my beliefs differ from law. I believe that privacy should have more emphasis than free speech. I believe that rights should be extended to animals. This includes privacy. I believe that copyright, work-for-hire and privacy can be aligned towards maximum collective benefit rather than historical levels of effort. I assume that I'm a pragmatist who can overcome some bias to understand people who don't follow law or social norms.
I hope that people understand the distinction between theft of physical property and the "theft" of information. The former may include physical violence or may otherwise incur permanent injury or risk to life. The latter leaves people (mostly) with the same bit patterns but may incur loss of privacy and/or loss of expected future revenue. Physical loss and data "loss" can both be devastating but people affected by "theft" are very likely to be content rich, physically secure and unaffected by an illusory social safety net.
It should be obvious that large-scale audio copying preceded video copying because audio generally requires less resources. However, a friend and I were quite curious about a potential rise in book copying. Historically, the easiest method to copy a book was photocopying and this was generally more expensive than buying another, legitimate copy of a book. Therefore, photocopying was generally restricted to sections of a book, rare books or matters of urgency. However, technology has changed and it is now possible to copy thousands of books per minute.
My friend, an author, was concerned that discs with 20000 books were openly advertised on EBay and elsewhere. I assumed it was only books that were freely available. My friend was less certain. I was asked to investigate. So, I sent about US$5 to a relatively reputable party and received a disc. Indeed, it had about 20000 books and, indeed, it was a smattering of text files with Gutenberg legal notices. However, some of it was certainly violating copyright. I gave the disc to my friend without taking a copy. (What am I going to do with 20000 fiction books? Read them all?)
My friend has now come to the conclusion that it is a sign of merit to be included in a general corpus of literature. Indeed:-
You do anything in the world to gain a reputation. As soon as you have one, you seem to want to throw it away. It is silly of you, for there is only one thing in the world worse than being talked about, and that is not being talked about. -- The Picture Of Dorian Gray by Oscar Wilde.
However, a major question remains. Why did book copying gain popularity after music and film copying? The answer is that books are copied in large *sets*, and therefore the volume of data exceeds the average film. From minor extrapolation, there will be a subsequent round of music and film copying, but each item will be a set of media. So, expect stuff like 1960s-Motown.zip followed by French_Noir_(Subtitled).7z and finally Terran-Culture-(1650-2050).tar.xz
However, what happens when those fine pioneers of the digital frontier run out of third-party content? Will they live-stream themselves at increasingly detailed resolution? Or will they find other stuff to distribute? We've already seen US-Diplomatic-Cables-(1966-2010) (and the resulting chaos). Will we see US-Tax-Records-(2010-2022).zip? Maybe Medical-Records-UK-NHS-2035.zip? DNA-From-Iceland-2042.7z? What havoc will this cause? Will we see the Helvetican War where lawyers, bankers and sociopaths are hung from lamp-posts? Will this be followed by bio-engineered genocide? I don't know. However, I know that:-
(This is the 31st of many promised articles which explain an idea in isolation. It is hoped that ideas may be adapted, linked together and implemented.)
We've come to the end of computing as originally envisioned by Xerox, a photocopier company, and, to quote one end-user, our tech is shit. Furthermore, it is directionless, with the exception that we use science fiction as our template to a greater or lesser amount. Firefly and The Expanse are a more pragmatic outlook than StarWars or the utopia of StarTrek. But what gives us the best bang-for-buck? How do we get from here to there while avoiding a deluge of foreseen consequences? I see a way which is broadly consistent, to the point that I encourage over-use to improve portability. However, it infuriates me that similar solutions aren't being widely sought.
The approach is to pretend that almost every technical development from Eternal September to present didn't occur. That would be an era without top-posting, PDFs or web browser DRM. It is an alternate path where Windows didn't dominate. If that scenario actually occurred, I doubt that vertically integrated companies, such as Commodore, Apple or Sun, would have been a better outcome than the Microsoft monopoly. Regardless, we have the hindsight to pick and choose from a rich history (while being very careful about recent developments) and then apply it to the current scale of integration - or anything simpler.
Currently, consumer and small business computing is meeting in the middle of Windows and Linux. Computing has become laughably insecure and it is getting worse. A chancer like John McAfee can suggest a "Privacy Phone" with physical switches to I/O. This is intended to be a stop-gap solution to improve security. However, look at the placement of the switches. They've never made anything close to a mock-up but none of the established interests dismiss it because they have no better solutions (and nothing to gain from highlighting the current situation).
Privacy looks like an illusion; a temporary aberration in a narrative from tribes to cyborgs. That's how major corporations and nation states like to portray the situation, especially when they have unwarranted access to servers, desktops, laptops, tablets and phones. The official recommendations made after the 1988 Internet Worm were never heeded. (Hey, what happened to the author of the first Internet Worm? Did they put him in jail for 60 years? No, the Ivy League son of a spook became a venture capitalist.)
At a minimum, we should have five of everything in common use: processors, compilers, operating systems, databases, application suites. We are failing miserably. Rock's law may impinge on hardware but we have no excuse in any other category. And if we believe that North Korea attacked UK government infrastructure then we are in a theater of war where it is advantageous to have the low ground.
We have barely enough infrastructure to co-ordinate and disseminate sustainable solutions. This would be solutions which don't empower a nation state and don't enrich a Silicon Valley oligarch. This is difficult but I'll try.
I envision components which are mostly software or can be implemented as software rather than hardware. Components can be used in quadcopters, robots, hi-fi, sensors and actuators. This is technology which should be Christmas light divisible: it should do something sensible or it should be open to modification. Do you remember when Spock re-wired his flip-phone to get extra range? Or when Geordi LaForge gets rescued because Federation technology inter-operates so well? That's Christmas light divisible - and we should aspire to this because there are no silver bullets which cover every case.
Consider an environment of isolinear relays and LCARS terminals. (I'd love a system which is functionally equivalent to LCARS. By happenstance, I found that oversize text is an emergent property of one fairly efficient implementation.)
It may be in a mundane environment which waters your plants, feeds your fishes or brews you beer. You wouldn't object if surplus bandwidth provided perimeter security. However, you would object if each device cost US$50. So, how do we make this secure and cost effective? Cut the RAM and processing power to minimum. This also makes it economically viable. We compete on our terms by getting closer to the theoretical optimal solution. The pay-off for halving RAM, I/O hardware or processing power is huge. The goal is to have one or more implementations which are good enough to maintain a network effect of inter-operability. If we consider ITRON, there is the outside possibility that billions of devices may be manufactured over decades. (ITRON is for micro-controllers and is a portable interface like POSIX but is concerned with timers and event buffers rather than files and messages.)
I've tried really hard to not be a greenfield techie. I tried to graft more innovation onto the edifice of contemporary computing. However, it is becoming increasingly difficult and may be doing all of us a dis-service.
I spent about eight months attempting to compile software securely and efficiently. It is a lost cause. It is possible to re-compile one component if every other component remains stable. However, coupling between components is far too tight. This has two practical consequences. Firstly, it is not possible within one lifetime to solder a trustworthy CPU and I/O devices, run an audited kernel, run audited applications and access the majority of the contemporary Internet. NAND To Tetris is a rigged academic exercise which should be best practice but is wishful thinking. Secondly, there is no meaningful software provenance. When software components directly rely upon dozens of other components, a succession of re-compiled components may create hundreds of additional references to software versions. A global re-compilation of an open source operating system may extend the chain of provenance by hundreds of thousands of references. We cannot trust our processors, our kernels, our compilers, our text editors, our web browsers or anything else - and our collective chain of trust is hopelessly lost in the mist of time.
It is tempting to discard everything and follow NAND To Tetris rigidly. But what does that achieve? You'll have something which is slower than a Commodore 64 and you'll be the only user. Instead, apply Christmas light divisibility. What was the most worthwhile development from Eternal September to present? 23 years of popular culture: film, music and literature. Forget about Solaris, ICQ, Flash animation, Kazaa and Windows Vista. Instead, remember Bourne, Bond and Batman. Oh, and people built up a huge pile of proprietary documents and hyperlinks.
Rather than treat legacy content as a first-class case, treat everything which is secure and verifiable as the first-class case and treat all of the whiz-bang, high bandwidth multi-media as a second-class case. It would be great to provide 100% downward compatibility but cases such as Flash are unviable. You'll be glad to know that I've considered a large number of other cases and the vast majority can be salvaged. It requires one or more user interfaces which provide:-
This may be based around an interface like Kodi - or Kodi itself. We can also run a subset of this functionality on a secure console.
Trump says anyone would collude, but in the 2000 election, I called the FBI
In September of 2000, when Al Gore and George W. Bush were tied in the polls in their race for the White House, I was preparing to play Bush as Gore’s debate prep sparring partner. What happened next ended my role in that campaign, but serves as a contrasting precursor to events in a presidential campaign 16 years later.
The day before our first practice, I received a package of materials from an anonymous source that contained several VHS tapes and “debate materials” and a letter indicating that more documents were on the way. Naturally, I popped in one of the VHS tapes.
The minute I saw George Bush dressed in shorts practicing for a Tim Russert style interview, I knew my role in the Gore campaign was over. The hundreds of hours of preparation studying public tapes of Bush, reading volumes of briefing books, practicing speech patterns and phraseology even at the dinner table, much to the chagrin of my family, were utterly wasted. I stopped the tape after about 15 seconds, picked up the phone and notified the Federal Bureau of Investigation and immediately recused myself from the campaign.
Today, 2017-07-19, a marine robot nicknamed the "little sunfish" is inside Japan's crippled Fukushima nuclear plant, at the primary containment vessel of the Unit 3 reactor. It is on a mission to study damage and find resources such as fuel that experts say has melted and mostly fallen to the bottom of a chamber. There's a picture of the robot too.