
Posts

Showing posts with the label Apple

MarsEdit and Blogging

MarsEdit (Photo credit: Wikipedia) Mailing posts to blogs, a practice I adopted in 2005, allows a blogger like me to store copies of draft posts within email. If Blogger, WordPress, or the blogging platform of the moment crashes or for some other reason eats my posts, at least I have the original drafts of most entries. I find having such a nicely organized archive convenient — much easier than remembering to archive posts from Blogger or WordPress to my computer. With this post, I am testing MarsEdit from Red Sweater Software based on recent reviews, including an overview on 9to5Mac. Composing posts in email offers a fast way to prepare draft blogs, but email does not always work well if you want to include basic formatting, images, and links to online resources. Submitting to Blogger via Apple Mail often produced complex HTML with unnecessary font and paragraph formatting styles. Problems with rich text led me to convert blog entries to plaintext in Apple Mail ...

Software That Feels Wrong

Original 1984 Macintosh desktop (Photo credit: Wikipedia) You look at the screen. You wonder what is wrong. The program or app does what it should do, but for some reason you don't like to use the software. Something feels wrong with the application. I have been trying WriterDuet (https://writerduet.com) and for the longest time I couldn't pin down why I didn't like the application compared to Final Draft or Screenwriter. Technically, the program does what it should and has some excellent collaboration features. But I don't enjoy using it. I used to love Screenwriter (http://www.screenplay.com), and I eagerly await version 6.5 for the Macintosh. Version 6.x has felt like a partial port (it is) to OS X and macOS for some time. I know the problem is that the "widgets" used for the user interface are not Apple's widgets for the current operating systems. It's slightly annoying, but I still like Screenwriter. In my ideal world, 6.5 is ...

Loyal, but Frustrated Apple Fan

English: The logo for Apple Computer, now Apple Inc. Rob Janoff designed the logo in 1977; the rainbow color theme was used until 1999, when Apple switched to a few different color themes for the same design. (Photo credit: Wikipedia) Apple needs a revamp. It has turned into a big phone maker, with little side hobbies in computing, software, and television. Sure, by any metric, the computing and software side is huge, but these feel like afterthoughts at the current Apple Inc. Apple Computer is no more, I realize, and the computer world today is nothing like the 1980s or even 2000, when a desktop computer was necessary for basic work. But someone has to code and create content, and creating content requires a big, powerful computer. I have some suggestions for Apple, which are unlikely to be read. Spin off the software so it becomes the primary focus of a stand-alone company or two companies. In fact, two is better...

What are the "Digital Humanities" Anyway?

When I read academic job listings for "Digital Humanities," the skills range from HTML coding to video editing. Some list audio editing. The jobs are so varied that you cannot pinpoint what the phrase means. Is my doctorate in rhetoric, scientific and technical communication sufficient? Often it is not. Some posts suggest an MFA or Ph.D. in media production. Starting January 2016, I am going to be working towards completion of my MFA in Film and Digital Technology. This feels like a last-ditch effort to revive my academic career, while also giving me more credentials to support my creative writing. With or without an academic revival, I'll benefit greatly from the courses and the exercise of creating and editing digital works. One of the frustrations I've had on the job market is that nobody seems to know what the "Digital Humanities" are or how to prove you have the skills to teach the courses. My age and my experiences are a serious obstacle on this...

You’re the Hero with Interactive Fiction

Zork I cover art (Photo credit: Wikipedia) Visalia Direct: Virtual Valley July 6, 2015 Deadline August 2015 Issue “This is an open field west of a white house, with a boarded front door. There is a small mailbox here. A rubber mat saying ‘Welcome to Zork!’ lies by the door.” These familiar words, which I once read on the blue screen of a Commodore 64, now appeared on my iPhone. Considered one of the first dozen computer games ever developed, Zork has a special place in computing history. Zork launched what is known as interactive fiction or text adventures. In 1977, four programmers working in the MIT Laboratory for Computer Science created the interactive fiction story “Zork.” Some of these friends would go on to create one of the earliest video game publishers, Infocom. From 1979 through 1986, Infocom was one of the leading game publishers, marketing games for every major home computer. Purchased by Activision in 1986, the Infocom brand and its classic games live on, a...

Tech News Blues

An Apple II advertisement from the December 1977 issue of Byte magazine, pages 16 and 17. The second page described the features of the Apple II. The ad originally ran in May 1977 and was updated that December. (Photo credit: Wikipedia) Visalia Direct: Virtual Valley January 5, 2015 Deadline February 2015 Issue BYTE magazine stopped appearing on newsstands in July 1998. The name lived on for a time as an online publication, without many of its best columnists and without its definitive test lab reports. Finally, in 2009, the real BYTE ceased to exist. Other online publishers revived the name, but it was never the same as the legendary print publication. In November 2014, my favorite online technical resource for Apple power users and developers, OS X Hints, went into archive mode. A month later, on December 16, 2014, Dr. Dobb’s Journal followed BYTE into the virtual sunset after 38 years of publication. In fact, they call it “sunsetting” the publication: Dr. Dobb’s wil...

Tablet Tales: Not Quite Replacing a Laptop

Visalia Direct: Virtual Valley April 1, 2014 Deadline May 2014 Issue Tablet Tales: Not Quite Replacing a Laptop Tablets almost, but not quite, replace notebook and laptop computers. That’s my experience after four months with an iPad Air. Bad weather influenced the decision to purchase a tablet. I was using a rolling case for my computer, cables, adapters, textbooks and student papers. Ice and snow made it impossible to roll the cart up and down the hills of Pittsburgh; rock salt jammed the wheels. I attempted to carry my computer and supplies in a large case with a shoulder strap. My back and shoulders weren’t pleased. Trekking through the snow, I decided my computer weighs too much. I love my 15-inch MacBook Pro, the last model available with an optical disc drive and six ports, but all the options add a fair amount of weight. I still use and record discs for projects, and I prefer having a wired network connection at home for extra speed and security. When I lug the system...

Tablet Time: When Less is Best

Visalia Direct: Virtual Valley November 4, 2013 Deadline December 2013 Issue Tablet Time: When Less is Best My next computer will be a tablet. Yes, I called it a computer because today’s tablets can replace a notebook system for many routine tasks. Though I sometimes need the power and features of a notebook or desktop computer, a tablet is perfect for surfing the Web, answering email, reading books and viewing presentations. When I upgraded from a 12-inch notebook to a 15-inch laptop, the portable computer replaced my desktop system. In return for the extra screen real estate and significant computing power, my carrying case gained weight. Walking across a university campus, the 5.6 pounds of a MacBook Pro plus the weight of its power supply and two video adapters starts to feel like 20 pounds. Most days, I don’t need the power of a laptop in my classroom. I use the laptop to show slides and pages of articles while lecturing. Students do ask to review work and grades, so ...

Apple Tech is Evolutionary, Not Revolutionary

English: Apple IIe computer (enhanced version) (Photo credit: Wikipedia) Visalia Direct: Virtual Valley October 7, 2013 Deadline November 2013 Issue Apple Tech is Evolutionary, Not Revolutionary Technology revolutions are not as sudden as people believe. Not even Apple has released successful revolutionary products every year or two. “The Myth of Steve Jobs’ Constant Breakthroughs” by Harry McCracken appeared on Time Magazine’s Techland site in September 2013 (http://techland.time.com/). McCracken examines the myth of “revolution” that has lingered after the death of Jobs. You have to feel sorry for chief executive Tim Cook and lead designer Jonathan (“Jony”) Ive, as they try to live up to mythology. Apple, as a company, has a mixed history of innovation. My wife and I are an Apple household. We own an iMac, Mac mini, a collection of MacBook Pro models, iPhones, iPods, and an iPad. Apple dares to deliver products that its designers and engineers want, not what cust...

Letters from the Mailbag

The three PlayStation consoles side by side. (Photo credit: Wikipedia) Visalia Direct: Virtual Valley May 20, 2013 Deadline July 2013 Issue Letters from the Mailbag Questions and suggestions from readers arrive every month. It’s always nice to help people with a technical question, and many of the questions inspire columns. This month, I’m sharing some questions with short responses. When I don’t have a good answer, I’m sharing that, too. Q: Do you have a favorite gaming console? A: When buying a console, consider the games first. Many games are platform exclusives, especially for the Nintendo consoles. Other games ship first for one or two consoles months or years before the games are available for other devices. The gamers I know tend to own Sony and Microsoft consoles, while parents of young children seem to prefer Nintendo devices. I own a dust-collecting Sony PlayStation 2. Consoles have largely replaced personal computers for gaming, but I dislike the types of ga...

Learning to Code: Comments Count

I like comments in computer programming source code. I've never been the programmer to claim, "My code doesn't need comments." Maybe it is because I've always worked on so many projects that I need comments to remind me what I was thinking when I entered the source code into the text editor. Most programmers end up in a similar situation. They look at a function and wonder, "Why did I do it this way?" Tangent: I also like comments in my "human" writing projects. One of the sad consequences of moving to digital media is that we might lose all the little marginalia authors and editors leave on manuscript drafts. That thought, the desire to preserve my notes, is worthy of its own blog post — so watch for a post on writing software and notes. Here are my rules for comments: Source code files should begin with identifying comments and an update log. Functions, subroutines, and blocks of code should have at least one descriptive comment. ...

Learning to Code: The Tool(s)

If you want to learn Objective-C, it helps to know C. Learning C — or reviewing it — is a good way to become familiar with Apple's development tools, too. Learning to program is a cause of mine. I advocate teaching programming to all students, not merely a handful of geeks, hackers, or nerds. When we teach everyone about coding, it demystifies how computers work and it introduces students from a wider variety of backgrounds to what could be an excellent career path. Years ago, educators would use LOGO or BASIC in elementary school classrooms. Then, along came HyperCard. There are still introductory programming tools based on LOGO, BASIC, and HyperTalk languages. You can learn to program using AppleScript or by writing Microsoft Word macros in Visual Basic for Applications (VBA). Personally, I'm for using whatever tools a teacher might enjoy at the earlier ages (K-6). In high school, though, I am biased towards plain, simple C as a foundation for future coding skills. ...

Learning to Code: Starting Point for Objective-C

As readers of this blog know, the more I delve into programming, the more convinced I am that it should be a standard school subject — not merely an elective sought after by a few enthusiastic students. Programming skills reinforce the value of breaking problems into simpler pieces. In the end, a computer must reduce problems to simple tasks. Good writers, historians, chemists… we all break problems into little, digestible, solvable tasks. And just as a musician must practice scales, the basics of programming need to be practiced and sometimes revived. It is no secret that my C skills have atrophied, so I am starting from the beginning. My journey towards Cocoa Goodness begins with two books: Clair, Robert. Learning Objective-C 2.0: A Hands-on Guide to Objective-C for Mac and iOS Developers. 2nd ed., 2013. ISBN 9780321832085 / 0321832086. Perry, Greg M. Absolute Beginner's Guide to C. 2nd ed., Indianapolis, Ind.: Sams Pub., 1994. ISBN 0672305100. In the early pages of Clair...

Computer Languages Change - Like Spoken Languages

Are computer languages inherently "artificial" and "pure" — either like Esperanto or a dead language, such as Latin? Or are computer languages as much "living" as spoken languages? Understand, I am not considering low-level assemblers or "dead" computer languages that exist in virtual museums (and yes, there are tech archives to explore). I mean the languages that are in wide enough use that programmers develop attachments to them and vocally argue about their futures. In spoken languages, some people are purists. These experts like to "prescribe" grammars and the meanings of words, insisting on a rigid approach to a language. By comparison, some scholars of language are "descriptive" researchers, trying to document a language's evolution. Most scholars, however, are a bit of both — we try to prescribe dominant rules, while accepting that change will happen. The French try desperately to maintain an official "Frenc...

Computer Languages, Human Languages

One reason I am writing about learning to code Cocoa apps for OS X and iOS is that I view programming languages as specialized human languages. After all, humans do create the languages. We create computer languages with our notions of what a language should be, from its grammar to its level of abstraction. In my post on the generations of programming languages, I mentioned that languages are interpreted, compiled, or translated, with some variations and complexity within those processes. Let's consider how these compare to the human language process. Compiled Languages About as close to the "natural" machine code as many programmers get on a regular basis, compiled languages remind me of our "native" spoken and written languages. After a while, we think our silent thoughts in a human language. We are entirely unaware of how our brain converts (or compiles) the language into neurological pulses. For me, English seems to be the language of my brain — eve...