Wednesday, September 29, 2010

Reading Notes 5

Reading Notes for 10/04

Database Wikipedia

Database infrastructure and management exist to partition data into manageable chunks for search and retrieval, particularly through indexing, and the initial structure has to have flexibility built in for growth or the speed of data maintenance and retrieval will decline sharply. Locks are a simple and ingenious way to ensure you don't have different versions of the same object floating around, creating confusion about which data is the most current and accurate. The external levels of databases seem analogous to the GUIs of operating systems.
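The locking idea can be seen in miniature with a toy Python sketch (not from the reading; the `deposit` function and the shared `record` are my own hypothetical example). The lock forces writers to take turns, so no update is lost to a stale version of the data:

```python
import threading

record = {"balance": 0}
lock = threading.Lock()  # one writer at a time, like a database write lock

def deposit(amount):
    with lock:  # acquire the lock before touching the shared record
        current = record["balance"]
        record["balance"] = current + amount
        # without the lock, two threads could both read the same
        # "current" value and one deposit would silently vanish

threads = [threading.Thread(target=deposit, args=(1,)) for _ in range(100)]
for t in threads:
    t.start()
for t in threads:
    t.join()
print(record["balance"])  # 100 — every update survived
```

Real databases layer much more on top (row vs. table locks, deadlock detection), but the core guarantee is the same as this sketch.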

Intro to Metadata

It would seem advantageous for the various information and preservation communities to put greater effort into adopting and enhancing a system like MOAC for broader cross-disciplinary access. Developing different methods of description and searching within hierarchical metadata is a positive step: it balances the needs of expert and amateur researchers, rather than adopting a dumbed-down system that increases accessibility for all to the detriment of the expert user.

While user-created metadata and folksonomies have become very popular, it seems unlikely that standards can be applied to those descriptions unless they arise organically within a social community. And does a full-text digital surrogate qualify as metadata? I'm not very clear on that.

Overview of the Dublin Core Data Model

The DCMES, built upon RDF, creates a semantic description system usable across disciplines. Using this schema will hopefully ease cross-disciplinary research and improve the interoperability of existing discipline-specific databases. The capacity for semantic refinement is a large part of what makes the DCMI relevant: its generic terms can be linked to the more specific terms used by a discipline's standard metadata system.
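A minimal sketch of how that refinement linkage might work (the specific field names and the `REFINES` table here are hypothetical illustrations, not part of the Dublin Core standard): a search on a generic Dublin Core element can also match records described with a community's more specific terms by walking each refined term up to its broader element.

```python
# Hypothetical mapping from discipline-specific refined terms to the
# broader generic Dublin Core element each one specializes.
REFINES = {
    "engraver": "contributor",        # e.g. a museum-community refinement
    "dateCopyrighted": "date",
    "abstract": "description",
}

def broaden(field):
    """Return the generic Dublin Core element for a refined field,
    or the field itself if it is already generic."""
    return REFINES.get(field, field)

print(broaden("engraver"))  # contributor
print(broaden("title"))     # title (already a generic element)
```

The point of the sketch is the direction of the link: a system that only understands the fifteen generic elements can still do something sensible with a refined record, which is what makes cross-disciplinary interoperability plausible.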

Tuesday, September 28, 2010

Muddiest Post 4

Muddiest Point for 9/27

In relation to our discussion of digital image compression, I have recently seen commercials for a new HDTV from Sharp called the Quattron, which claims to use a "quad pixel" display that adds a Y (yellow) subpixel to the conventional RGB subpixels. It also claims this will enable it to display "trillions" of colors. Given the limitations of the human eye, how much of that difference will we even be able to perceive? And if the files being broadcast to that TV are compressed with a method that uses RGB, can a Y enhancement even be added on the decompression end? I'm confused about how that would work.

Saturday, September 25, 2010

Reading Notes 4

Reading Notes for 9/27

Data Compression
I have used various ZIP files and programs before, so I have been familiar with the benefits of data compression without giving much consideration to the process. It is interesting that we can use "lossy" methods to compress audio files because of the limits of human hearing. Couldn't we eliminate those frequencies when creating the original files and keep them smaller to begin with? It seems likely that the more advanced our visual capture equipment becomes (digital cameras, HD video recorders, 3D video recorders, etc.), the more difficult it will be for any of these compression methods to shrink file size by a significant percentage.
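The lossless side of this is easy to see for yourself with Python's standard zlib module (the same DEFLATE family of compression used inside ZIP files); the sample text is just an arbitrary repetitive string I made up. Redundant data shrinks dramatically, and, unlike a lossy codec, the round trip is exact:

```python
import zlib

# Highly repetitive data compresses very well.
text = b"the quick brown fox " * 500
packed = zlib.compress(text)

print(len(text), len(packed))  # original size vs. compressed size

# Lossless means a perfect round-trip: decompressing restores
# every byte, which is exactly what a lossy audio codec gives up.
assert zlib.decompress(packed) == text
```

Camera output is the hard case the paragraph above anticipates: sensor noise looks random rather than repetitive, so lossless methods like this find little redundancy to squeeze out.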

Imagining Pittsburgh
This is a great, specific example of the multitude of challenges that arise in multidisciplinary collaborations. Interdisciplinary and inter-institutional projects are the ideal of open-science proponents, and the photographic database created by the project is a great resource that would not have been assembled otherwise; but it is probably a recurring difficulty with projects of this nature to get everyone on the same page on issues even as basic as subject language. Imagine the difficulties that must arise when collaborators do not even share a geographic region.

YouTube and Libraries
Libraries need to be less conservative and more innovative in their tactics for gaining patrons, and the popularity of YouTube combined with its cost ($0) makes it a great tool. I know the Volunteer Services and Development departments at CLP use it occasionally, but they could get much more benefit by adopting some of the ideas in this article, like online tours, descriptions of services, and database tutorials.

Tuesday, September 21, 2010

Muddiest Post 3

Muddiest Point for 9/20

If Unix is superior in so many ways, why doesn't Microsoft build its OS on Unix, making the transition much the same way Apple did with Mac OS X around 2000-01? Presumably, they could do this in a manner that would leave their traditional GUI largely unchanged but enhance their system's security, speed, and stability.

Saturday, September 18, 2010

Comments 1

Comments for Week 3

http://megrentschler.blogspot.com/2010/09/week-3-reading-notes.html?showComment=1284857538514#c27441410452898145

http://sarahwithtechnologyblog.blogspot.com/2010/09/week-3-reading-notes.html?showComment=1284858183270#c6338360962293184919

Reading Notes 3

Reading Notes for 9/20

Introduction to Linux: A Hands on Guide
Linux seems like a great OS for those experienced with computer code, giving them the ability to tweak the code and develop specific changes to fit their computing needs. However, the number of available distributions, combined with the common user's fear of the unknown (the unknown being, for the majority, everything but Windows), makes the idea of transitioning intimidating. While Linux is free, most Windows users likely overlook the cost of their OS because it is hidden in the total price they pay when purchasing a desktop or laptop system. And while Linux distributions have made attempts to improve their user-friendliness and appeal to more mainstream users, Linux still seems best suited to experts looking to optimize some component of their system beyond the capabilities of a more mainstream OS.

Mac OS X (http://www.kernelthread.com/mac/osx/ and http://en.wikipedia.org/wiki/Mac_OS_X)
Am I the only person who had to look up Apache? After it came up in both of the first two readings, I thought it would be good to have some idea what they were talking about. Now I'm trying to fully understand what a web server is. It is also interesting to read in this 2003 article that the perception at the time was that far less software was available for Mac OS X, particularly games. This has clearly changed in the current culture where, as everyone knows, there's an app for that. Doesn't the fact that OS X can only run on Apple hardware limit its selling ability? Has Apple ever considered selling a PC-compatible OS? Maybe then they could grab a market share of more than 4.5%.

An Update on the Windows Roadmap
So is Windows a Unix-based system? No, and it clearly dominates the market regardless; it is probably because of this economic position that, unlike the Linux and Mac OS X readings, there are no mentions of open-source collaborations for software development. As a Windows user, I was unimpressed by the changes from XP to Vista, but I have been very happy with Windows 7, which this article refers to in the future tense. And while Microsoft discusses its awareness of the security issues associated with Windows and the improvements in security with Vista, Windows is still the most vulnerable of the OSs covered in our readings, possibly because of the huge market share it controls.

Thursday, September 16, 2010

Muddiest Point 2

Muddiest Point for 9/13

When Dr. He discussed the life expectancy of different media formats, he advised users to "refresh" the information on, say, their hard drive every 3-5 years. How does one "refresh" this information, and does doing so help slow the overall degradation of the drive, or just of the information?

Friday, September 10, 2010

Reading Notes 2

Reading Notes for 9/13

Computer Hardware
A recurring thought during the first few LIS 2600 readings is curiosity about how many people who use a computer daily actually have a working understanding of what the individual hardware components are and how they function together. I have seen a computer being built and thus feel slightly more educated about what the different components are, but I rarely think about the processes they are performing while I use them. Firmware is also absolutely critical to a computer's function, yet it is easy to ignore or know nothing about. It's easy to see how something small can go wrong (a faulty transistor, firmware or software that hasn't been updated) and leave the whole system unusable.

Moore's Law and Video
The cost component of Moore's Law is a major factor in the rapid, exponential development of digital and technical capacity. When integrated circuits with twice the capacity become available at little increase in cost over their predecessors, there is little downside for the computing community in routinely adopting the newest chip. However, the ability to keep doubling is finite and will likely hit a ceiling in the next 5-10 years, and while much of what Moore's Law has enabled us to accomplish is positive and revolutionary, the corresponding increases in power consumption and bloat, as well as the rising costs for producers and developers, are not ideal.
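The arithmetic behind the exponential is worth making concrete. Taking the common reading of Moore's Law (transistor counts doubling roughly every two years) and starting from the Intel 4004's approximately 2,300 transistors in 1971, a quick back-of-the-envelope sketch (ballpark only, not a prediction) shows how fast the curve climbs:

```python
BASE_YEAR, BASE_COUNT = 1971, 2300  # Intel 4004, roughly 2,300 transistors

def transistors(year, doubling_years=2):
    """Estimated transistor count under a doubling every `doubling_years`."""
    return BASE_COUNT * 2 ** ((year - BASE_YEAR) / doubling_years)

# ~39 years of doubling every 2 years is about 2**19.5, i.e. roughly
# 700,000x growth — from thousands of transistors to the billions
# found on chips around 2010.
print(f"{transistors(2010):.2e}")
```

Even a small change in the doubling period swings the result by orders of magnitude over four decades, which is why "the next 5-10 years" of the curve matters so much.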

Computer History Museum
The speed at which computing has grown and evolved really resonates in the fact that one of the largest international collections of computing artifacts has been built in all of 11 years by an organization founded in 1999. It seems natural, though, that computer science should be scrutinized and celebrated the same way our more mature sciences are. It is fitting that the museum resides in Silicon Valley, like the Louvre in Paris. The Silicon Engine Timeline was a good tool for visualizing the birth of the transistor and the technological development that led to it. The museum's collection could help libraries identify which documents relating to the computer science field are of the greatest value.

Reading Notes 1

Reading Notes for 8/30

OCLC report: Information Format Trends: Content, Not Containers
One of the statistics that stood out in the OCLC report was the sheer projected number of emails sent on a yearly basis. Dr. He made a reference to information retrieval as the act of finding a needle in a haystack, and we just keep creating bigger and bigger stacks. It is also interesting to note how quickly mobile devices have led us to expect constant Internet access from anywhere at any time (even when it's not possible, like, say, from my sister's campground in Tionesta), and how that mobile access is, in some parts of the world, superseding more traditional computer-based access because a cellphone is typically far less expensive than a PC. I think this change in access, combined with our desire for instantaneous fulfillment, is a major driving force behind the development of micro-payments for micro-content. Basically, so much content is being created so quickly and across so many different formats, with much of it becoming obsolete at almost the same speed, that it's easy to see how difficult the task of meaningful information retrieval can become.

“Information Literacy and Information Technology Literacy: New Components in the Curriculum for a Digital Culture”
While information technology is the necessary infrastructure of information literacy, the lay users of today's most popular technologies (Twitter, Facebook, SMS, Google) have very limited knowledge, if any, of how that infrastructure is designed and functions. I am not sure I really buy the argument that future basic users of information technology will need to understand a program's technological infrastructure to use it successfully. As technology moves forward so rapidly, the companies behind its development will actively design with the user in mind, because the more complex the required understanding, the narrower the pool of users and consumers. A higher level of IT understanding will remain critical for those involved in information science, storage, and retrieval, but likely not for the general public, because it doesn't make sense from an economic perspective.

Lied Library @ four years: technology never stands still
Lied Library is a great illustration of the way flexibility, connectivity (my first encounter with Internet2), investment, good software and support, and creativity all have to come together to keep a technologically advanced library on the cutting edge. The costs are large and likely the first deterrent for many academic and public library systems, but the complications go far beyond cost. Lied's success at bringing people in the door means it is meeting its goal, but meeting that goal only multiplies the issues that come with such high use, from the need to limit public computer use in order to guarantee computer access for students, to the way heavy use of the printers helps pay for their operation but also accelerates their deterioration. Lied Library illustrates, through hardware and software issues alike, why it's important for those in the information sciences to have, at the very least, a solid IT understanding of the specific technology they work with.

On a side note, their efforts to create greater connectivity among staff members are a small-scale example of why global connectivity is so desirable. They were looking to increase communication as a way to foster problem identification and collaborative, creative problem-solving. It all comes full circle.

Muddiest Point 1

Muddiest Point for 8/30-First Post

First blog post ever! Here goes. While I think I have a good grasp of what's required in terms of participation and coursework for LIS 2600, I'm a little muddy on the comments we're required to make on other people's posts and questions. I know we're to make two a week, but where do we put them: on that student's blog, on our own, or on a CourseWeb discussion board? Hopefully the answer will become obvious as I use Blogger, but right now I'm not sure.