School of Information Sciences

iSchool Visiting Lectures (Past Lectures)

2016

Friday, April 8, 2016

Dr. Farnam Jahanian

11:30 a.m. - 1:30 p.m.
Sennott Square, Room 5317
210 South Bouquet St, Pittsburgh, PA

Dr. Farnam Jahanian, Provost at Carnegie Mellon University

“A Reflection on Computing and Computing Schools of the Future”

This event is co-sponsored by the University of Pittsburgh School of Information Sciences’ Visiting Lecture Series & the CS/SIS Collaboration Project.

Provost Farnam Jahanian will discuss the potential and impact of computing research and education. He will also explore the intellectual, programmatic and organizational foundations of future schools of computing.

Bio: Dr. Farnam Jahanian is the Provost at Carnegie Mellon University. He joined CMU in 2014 as the Vice President for Research. Dr. Jahanian served as assistant director for Computer and Information Science and Engineering (CISE) at the National Science Foundation from 2011 to 2014. He was previously a professor at the University of Michigan, where he served as Chair of Computer Science and Engineering from 2007 to 2011 and Director of the Software Systems Lab from 1997 to 2000. In 2000, he co-founded Arbor Networks, a network security company. His research focuses on the scalability, dependability, and security of networked systems and applications. His interests include distributed computing, network security, and network protocols.

Jahanian holds a master's degree and a Ph.D. in Computer Science from the University of Texas at Austin. He is a Fellow of the Association for Computing Machinery (ACM), the Institute of Electrical and Electronics Engineers (IEEE), and the American Association for the Advancement of Science (AAAS).


Friday, March 4, 2016

10:00 a.m. - 1:00 p.m.
Sennott Square, Room 5317
210 South Bouquet St, Pittsburgh, PA

Andrew Moore, Dean, School of Computer Science, Carnegie Mellon University

“A Perspective on Computing and Schools of Computing”

This event is co-sponsored by the University of Pittsburgh School of Information Sciences’ Visiting Lecture Series & the CS/SIS Collaboration Project.

Andrew Moore will discuss computing research and the role of computing in the future. He will also discuss his experience in industry with respect to collaboration and partnering with academia.

Bio: Andrew Moore is the dean of the School of Computer Science at Carnegie Mellon University. His background is in statistical machine learning, artificial intelligence, robotics, and statistical computation for large volumes of data. He has worked in areas of robot control, manufacturing, reinforcement learning, algorithms for astrophysics, algorithms for detection and surveillance of terror threats, internet advertising, internet click-through prediction, ecommerce, and logistics for same-day delivery. He is passionate about the impact of technology (algorithms, cloud, computational biology, artificial intelligence, and software development processes) on the future of society. Andrew received his PhD from the University of Cambridge in 1991 and became a faculty member at CMU in 1993. Moore took a leave of absence from CMU in 2006 to serve as a leader of Google’s Pittsburgh office. In 2011, he was vice president of engineering at Google Commerce. Moore returned to CMU in 2014 to become dean of the School of Computer Science.


Thursday, February 11, 2016

11:00 a.m. - Noon
IS Building, 3rd Floor Theatre

Richard DeMillo, Charlotte B. and Roger C. Warren Professor of Computing and Professor of Management, former John P. Imlay Dean of Computing, and Executive Director of the Center for 21st Century Universities at the Georgia Institute of Technology

“A Perspective on Computing, Schools of Computing, and Higher Education”

This event is co-sponsored by the University of Pittsburgh School of Information Sciences’ Visiting Lecture Series & the CS/SIS Collaboration Project.

In his presentation, Dr. DeMillo will discuss the potential of computing and explore the intellectual, programmatic and organizational foundations of computing academic units of the future. He will also discuss innovation and change in colleges and universities that are currently happening at a fast pace, focusing on initiatives that will define the next generation of educational practices and technologies. 

Bio: Richard DeMillo is the Charlotte B. and Roger C. Warren Professor of Computing and Professor of Management, former John P. Imlay Dean of Computing, and Executive Director of the Center for 21st Century Universities at the Georgia Institute of Technology. Author of over 100 articles, books, and patents, he has held academic positions at Purdue University, the University of Wisconsin, and the University of Padua. He directed the Computer and Computation Research Division of the National Science Foundation and was Hewlett-Packard's first Chief Technology Officer. He is the 2013 Lumina Foundation Inaugural Fellow, an award that recognized his founding of the Center for 21st Century Universities as a “unique institution.” He is also a Fellow of both the American Association for the Advancement of Science and the Association for Computing Machinery. He is the author of the influential 2011 book “Abelard to Apple: The Fate of American Colleges and Universities” and a 2015 sequel entitled “Revolution in Higher Education: How a Small Band of Innovators Will Make College Accessible and Affordable.” Both books were published by MIT Press.


Monday, February 8, 2016

11:00 a.m. - Noon
IS Building, 3rd Floor Theatre

Cliff Lynch, Executive Director, Coalition for Networked Information (CNI)

“Big Challenges and Schools of Information”

This event is co-sponsored by the University of Pittsburgh School of Information Sciences’ Visiting Lecture Series & the CS/SIS Collaboration Project.

In our work at the Coalition for Networked Information (CNI) we have tried to make progress in several areas, including redesigning scholarly communication, facilitating archiving and stewardship of the cultural record, and managing balances among privacy, analytics, and information flows. I'll describe some features of the challenges in each of these areas; what they have in common is that they are complex, very large scale, and critical to the future of our society. The portfolio of scholarly work hosted at schools and departments of information, computer and computational science, and emerging units addressing data science have certainly contributed in these areas, but I believe that they need to do much more, and I'll examine some of the barriers, opportunities and specific collaborations that I believe are needed to accelerate progress.

Bio: Clifford Lynch has led CNI since 1997. CNI’s wide-ranging agenda includes work in digital preservation, data intensive scholarship, teaching, learning and technology, and infrastructure and standards development. Lynch spent 18 years at the University of California Office of the President, the last 10 as Director of Library Automation. Lynch, who holds a PhD in Computer Science from the University of California, Berkeley, is an adjunct professor at Berkeley’s School of Information.  He is both a past president and recipient of the Award of Merit of the American Society for Information Science, and a fellow of the American Association for the Advancement of Science and the National Information Standards Organization. In 2011 he was appointed co-chair of the National Academies Board on Research Data and Information. His work has been recognized by the ALA’s Lippincott Award, the EDUCAUSE Leadership Award in Public Policy and Practice, and the American Society for Engineering Education’s Homer Bernhardt Award.


Friday, February 5, 2016

3:00 p.m. - 4:00 p.m.
IS Building, 3rd Floor Theatre

Bobby Schnabel, CEO, Association for Computing Machinery (ACM)

“ACM Overview and Strategic Priorities”

This event is co-sponsored by the University of Pittsburgh School of Information Sciences’ Visiting Lecture Series & the CS/SIS Collaboration Project.

This will be an interactive discussion of ACM programs and strategic priorities, intended to solicit comments, suggestions and discussion from the audience. We will give an overview of ACM’s scope, conferences and publications, international activities, and activities that support the broad computing community in areas including diversity, education, and policy. Then we will present a summary of ACM strategic priorities including: becoming a fully international society; continually evolving to reflect the latest technical developments and applications in computing; strengthening ties with the entrepreneurial community; playing a broad leadership role in diversity; providing increased value to computing practitioners; and continually evolving its modes of communication with ACM members and the world.

Bio: Bobby Schnabel is CEO of the Association for Computing Machinery (ACM), the oldest and largest society of computing professionals and students. ACM is an international society of over 110,000 members that publishes over 50 journals and magazines and conducts nearly 500 conferences and workshops per year.

Prior to beginning as CEO of ACM in Nov. 2015, Schnabel was dean of the School of Informatics and Computing at Indiana University (IU) from 2007-2015. In this position he led a multi-campus school of over 150 faculty and 3500 students at the Bloomington and Indianapolis campuses, including undergraduate and graduate programs in computer science and informatics, and graduate programs in data science, information science, and library science. From 2009-2010, he also served as interim IU vice president for research.

Prior to joining IU, Schnabel was on the computer science faculty of the University of Colorado at Boulder from 1977-2007. He served as vice provost for academic and campus technology and chief information officer at the University of Colorado at Boulder from 1998-2007, and as founding director of the Alliance for Technology, Learning, and Society (ATLAS), a campus-wide information technology institute, from 1997-2007. Schnabel is a co-founder and executive team member of the National Center for Women & Information Technology. He currently serves on the advisory committee for the NSF Directorate for Computer and Information Science and Engineering and for the Computing Alliance of Hispanic-Serving Institutions (CAHSI), and on the board of code.org. He is a fellow of ACM and of SIAM.


Thursday, January 21, 2016

3:00 p.m. - 4:00 p.m.
IS Building, 3rd Floor Theatre

Alex Beutel, PhD candidate, Carnegie Mellon University, Computer Science Department

"Beyond Who and What: Answering How? and Why? by Modeling Large Graphs"

Can we model how fraudsters work to distinguish them from normal users? Can we predict not just which movie a person will like, but also why? How can we find when a student will become confused or where patients in a hospital system are getting infected? How can we effectively model large attributed graphs of complex interactions?

In this talk we will focus on user behavior modeling through understanding heterogeneous graphs. Online, users interact not just with each other in social networks, but also with the world around them. These interactions often include insightful contextual information, such as the time or location of the interaction and ratings or reviews about the interaction. We demonstrate that through modeling these large heterogeneous graphs and their rich contextual information, we can improve both anomaly detection and prediction algorithms. By modeling how fraudsters work, our anomaly detection algorithms can better differentiate fraudsters from honest users; by carefully modeling user ratings and reviews, we can predict not just which item a user would like but also why. Additionally, we will demonstrate that by understanding the structure of these models we can build flexible platforms for scalable modeling of large graphs. Finally, we will discuss the future of graph modeling, covering new exciting applications, novel modeling approaches, and upcoming challenges in scalable machine learning.

Bio: Alex Beutel is a PhD candidate at Carnegie Mellon University in the Computer Science Department. He previously received his BS from Duke University in Computer Science and Physics. His primary interest is in modeling large graphs, with his PhD thesis focused on large-scale user behavior modeling, covering recommendation systems, fraud detection and scalable machine learning. Beyond his research at CMU, Alex has worked on large-scale user behavior modeling at Facebook, Google, and Microsoft.


Tuesday, January 19, 2016

1:45 p.m. - 2:15 p.m., Presentation
2:15 p.m. - 2:30 p.m., Q&A
2500 Wesley W. Posvar Hall

Lieutenant General Robert E. Schmidle, Jr., USMC, Principal Deputy Director, Cost Assessment and Program Evaluation

"Policy and Legal Implications of Emerging Cyber Capabilities"

This event is co-sponsored by the University of Pittsburgh School of Information Sciences, the University of Pittsburgh School of Law, and the University of Pittsburgh Graduate School of Public Affairs.

Join us for a special presentation as Lieutenant General Schmidle shares his perspective on the policy and legal implications of emerging cyber capabilities. This multi-disciplinary topic will engage and interest the entire Pitt community, and is particularly relevant to students and faculty in the law, information, policy, and technology sectors. Don't miss this rare opportunity to interact with one of the highest ranking members of the U.S. Marine Corps.

Bio: Lieutenant General Robert E. Schmidle, Jr., USMC, was appointed as the Principal Deputy Director of Cost Assessment and Program Evaluation, Office of the Secretary of Defense in March of 2014. In his current appointment, Schmidle is responsible for analyzing and evaluating plans, programs, and budgets in relation to U.S. defense objectives and resource constraints.

His command assignments include Commanding General, First Marine Aircraft Wing; Commanding Officer, Special Purpose Marine Air-Ground Task Force (Experimental); and Commanding Officer, Marine Fighter/Attack Squadrons 251 and 115. Previous operational assignments include multiple tours flying the F-4 and F/A-18 aircraft as well as serving as the operations officer and air officer of an Infantry Battalion, First Battalion, 9th Marines.

Additionally, Schmidle has served in the following key staff assignments: Deputy Commandant for Aviation; Deputy Commander for U.S. Cyber Command; Assistant Deputy Commandant of the Marine Corps for Programs and Resources (Programs); Deputy Chief of Staff for Integrated Product Team 1 for the 2006 Quadrennial Defense Review and USMC lead for the 2010 Quadrennial Defense Review; Deputy Director for Resources and Acquisition in the Joint Staff J-8; Director of the USMC Expeditionary Force Development Center; and the Military Secretary for the 32nd and 33rd Commandants of the Marine Corps.

Schmidle is a native of Newtown, Connecticut and holds a bachelor’s degree from Drew University, a master’s degree from American University, and earned his doctorate from Georgetown University. He has been published in the fields of moral philosophy, social psychology, and military history. He is also a distinguished graduate and prior faculty member of the Marine Corps Command and Staff College, as well as a distinguished graduate of the Marine Corps War College.


Thursday, January 14, 2016

3:00 p.m. - 4:00 p.m.
IS Building, 3rd Floor Theatre

Evangelos Papalexakis, PhD candidate, Carnegie Mellon University, School of Computer Science

"Big Signal Processing for Multi-Aspect Data Mining"

Abstract: What does a person's brain activity look like when they read the word apple? How does it differ from the activity of the same (or even a different) person when reading about an airplane? How can we identify parts of the human brain that are active for different semantic concepts? On a seemingly unrelated setting, how can we model and mine the knowledge on the web (e.g., subject-verb-object triplets), to find hidden patterns and missing links? My proposed answer to both problems (and many more) is through bridging signal processing and large-scale multi-aspect data mining.

Specifically, language in the brain, along with many other real-world processes and phenomena, has different aspects, such as the various semantic stimuli of the brain activity (apple or airplane), the particular person whose activity we analyze, and the measurement technique. In the above example, the brain regions with high activation for "apple" will likely differ from the ones for "airplane." Nevertheless, each aspect of the activity is a signal of the same underlying physical phenomenon: understanding language in the human brain. Taking into account all aspects of brain activity results in more accurate models that can drive scientific discovery (e.g., semantically coherent brain regions).

In addition to the above Neurosemantics application, multi-aspect mining appears in numerous applications such as mining knowledge on the web (where different aspects of the data include entities in a knowledgebase and the links between them or search engine results for those entities) and multi-aspect graph mining (with the example of multi-view social networks) where we observe social interactions of people under different means of communication, and we use all views/aspects of the communication to extract more accurate communities.

The main thesis of my work is that many real-world problems, such as the aforementioned, benefit from jointly modeling and analyzing the multi-aspect data associated with the underlying phenomenon we seek to uncover. In my research, I develop scalable and interpretable algorithms for mining big multi-aspect data, with emphasis on tensor factorization. In this talk, I will discuss multi-aspect data applications, focusing on Neurosemantics, and present my algorithmic work on scaling up tensor factorization by two orders of magnitude and assessing the quality of the results. I conclude with my future vision on bridging Signal Processing and Data Science for real-world applications.
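The tensor factorization the abstract emphasizes can be illustrated with a minimal CP (CANDECOMP/PARAFAC) decomposition fit by alternating least squares. This is a generic NumPy sketch of the standard technique, not the speaker's scalable algorithms; the function names and the dense, small-scale formulation are illustrative assumptions.

```python
import numpy as np

def khatri_rao(A, B):
    """Column-wise Khatri-Rao product of A (I x R) and B (J x R) -> (I*J x R)."""
    I, R = A.shape
    J, _ = B.shape
    return (A[:, None, :] * B[None, :, :]).reshape(I * J, R)

def cp_als(X, rank, n_iter=200, seed=0):
    """Rank-R CP decomposition of a 3-way tensor X via alternating least squares.

    Models X[i,j,k] ~ sum_r A[i,r] * B[j,r] * C[k,r].
    """
    rng = np.random.default_rng(seed)
    I, J, K = X.shape
    A = rng.standard_normal((I, rank))
    B = rng.standard_normal((J, rank))
    C = rng.standard_normal((K, rank))
    # Mode-n unfoldings of X (row index = the mode being solved for)
    X0 = X.reshape(I, J * K)
    X1 = np.moveaxis(X, 1, 0).reshape(J, I * K)
    X2 = np.moveaxis(X, 2, 0).reshape(K, I * J)
    for _ in range(n_iter):
        # Each factor update is a linear least-squares solve with the others fixed
        A = X0 @ np.linalg.pinv(khatri_rao(B, C).T)
        B = X1 @ np.linalg.pinv(khatri_rao(A, C).T)
        C = X2 @ np.linalg.pinv(khatri_rao(A, B).T)
    return A, B, C
```

On a small synthetic rank-2 tensor, the recovered factors reconstruct the input to high accuracy; real multi-aspect data (brain activity by stimulus by subject) would occupy the three modes instead.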

Bio: Evangelos (Vagelis) Papalexakis is a PhD candidate at the School of Computer Science at Carnegie Mellon University (CMU). Prior to joining CMU, he obtained his diploma and MSc in Electronic & Computer Engineering at the Technical University of Crete, in Greece.

His research interests span the fields of data science, data mining, signal processing, and machine learning. His research involves designing scalable algorithms for mining large multi-aspect datasets, with specific emphasis on tensor factorization models, and applying those algorithms to a variety of real world, multi-aspect data problems.

His work has appeared in KDD, ICDM, SDM, ECML-PKDD, WWW, PAKDD, ICDE, ICASSP, IEEE Transactions on Signal Processing, and ACM TKDD. He received a best student paper award at PAKDD'14, was a best paper finalist for the WWW Web Science Track '15, SDM'14, and ASONAM'13, and was a finalist for the Microsoft PhD Fellowship and the Facebook PhD Fellowship. Besides his academic experience at CMU, he has industry research experience working at Microsoft Research Silicon Valley during the summers of 2013 and 2014 and Google Research during the summer of 2015.


Monday, January 11, 2016

2:00 p.m. - 3:00 p.m.
IS Building, 3rd Floor Theatre

Harry Bruce, Dean, University of Washington Information School

"Creating an Intellectual Hub for Computing and Information Science at the University of Pittsburgh"

This event is co-sponsored by the University of Pittsburgh School of Information Sciences’ Visiting Lecture Series & the CS/SIS Collaboration Project.

Abstract: In this presentation, Harry Bruce (Dean and Professor, UW Information School) will share his experience with the restructuring of intellectual areas at the University of Technology, Sydney and at the University of Washington, Seattle. Based upon his observation of these efforts to create an integrated, highly productive, highly effective intellectual community from multiple constituencies, he will share his thoughts on why this goal is so difficult to achieve. Professor Bruce will identify important issues to be considered with a proposal to merge Computer Science and the School of Information Sciences at the University of Pittsburgh.

Bio: Harry Bruce is the Dean of the University of Washington Information School. His research and teaching focus on human information behavior, information seeking and use, and personal information management in networked information environments. Dr. Bruce’s research has been funded by the National Science Foundation, the Institute of Museum and Library Services (IMLS), the Washington State Library, and the Australian Department of Employment Education and Training.


2015

Monday, November 23, 2015

10:45 a.m., Refreshments
11:00 a.m. - 12:00 p.m., Talk
IS Building, 3rd floor Theatre

John Leslie King, Professor, University of Michigan and London School of Economics and Political Science

"The World Turned Upside Down: Information Research in Difficult Terrain"

This event is co-sponsored by the University of Pittsburgh School of Information Sciences’ Visiting Lecture Series & the CS/SIS Collaboration Project.

Abstract: Many are uneasy with the disconnect between changes in information (including information technologies) and research results that seem meager in comparison. Those drawn to the information fields by great changes often face constraints. When choosing between research that “matters” and research that is “safe” many choose safety. Who can blame them? For any given study or paper this choice might make sense. Over the long run it does not. Keeping “big ideas” in mind while doing work that “counts” for short-term rewards (getting promoted, etc.) takes thinking and work, but it is manageable. The talk provides reasons why to do this and some suggestions about how. It is basically optimistic.

Bio: John Leslie King is W.W. Bishop Professor of Information, former Dean of the School of Information and former Vice Provost at the University of Michigan. He joined Michigan in 2000 after twenty years on the faculties of computer science and management at the University of California at Irvine. He has published more than 200 books and papers from his research on the relationship between technical and social change. He was Marvin Bower Fellow at the Harvard Business School, distinguished visiting professor in Singapore (at both the National University of Singapore and at Nanyang Technological University), and Fulbright Distinguished Chair in American Studies at the University of Frankfurt. He is currently Visiting Professor at the London School of Economics and Political Science. He was Editor-in-Chief of the INFORMS journal Information Systems Research, and served as associate editor for other journals. He has been on the Board of the Computing Research Association (CRA), the Council of the Computing Community Consortium, and the U.S. National Science Foundation Advisory Committees for Computer and Information Science and Engineering (CISE), Social, Behavioral and Economic Sciences (SBE), and Cyberinfrastructure (ACCI). His PhD is in administration from the University of California, Irvine. He received an honorary doctorate in economics from Copenhagen Business School. He is an elected fellow of the Association for Information Systems and the American Association for the Advancement of Science.


Monday, March 23, 2015

11:00 a.m.
IS Building, Room 404

Meet the Author: Anthony Clark will discuss his book The Last Campaign: How Presidents Rewrite History, Run for Posterity & Enshrine Their Legacies

The Last Campaign explores the hidden politics & history of the taxpayer-funded, uniquely American shrines known as presidential libraries. Unrestrained commemoration, unregulated - and undisclosed - contributions, and unchecked partisan politics have radically altered the look and purpose of presidential libraries, changing them from impartial archives of history into extravagant, legacy-building showplaces where the goals of former presidents, their families, financial donors, and the national parties trump accuracy and the (often inconvenient) facts. Using primary source documents from his research at every presidential library, as well as his own analysis of the museums and public programs, Anthony Clark examines important aspects of the presidential library system, including:

•   How and why presidents choose the location for their libraries, in what is known as the site selection process;
•   The purpose and influence of the private foundations that build, and continue to operate within, presidential libraries;
•   The laws regulating access to presidential records, and how decisions made by the National Archives, the foundations, and Congress prevent those records from being opened for 100 years or more;
•   The "history" that is displayed in exhibits at presidential libraries, which purport to be public but which are in fact private, and often skewed and inaccurate;
•   The politicization of modern presidential libraries, which serve as vehicles to advance partisan interests and anoint future party leaders and candidates;
•   And suggestions for ways that Congress and the National Archives can improve, strengthen, and reform presidential libraries.

The author also describes his own attempts to bring Congressional oversight and reform to the presidential library system, as well as his earlier experiences as an independent researcher, struggling to gain access to more than three quarters of a million pages of the National Archives' own records on presidential libraries. These efforts resulted in the largest single FOIA release in the history of the National Archives. Americans deserve fair and accurate history in the libraries for which we pay; history based on records, not politics. But while presidents run for posterity, dedicating their self-congratulatory museums an average of four years after leaving office (complete with exhibits created to glorify them and their achievements), the records that show what actually happened won't be opened for more than a hundred years...unless we decide to do something, and reform our presidential libraries.

Bio: Anthony Clark is a former legislative director, speechwriter, and committee professional staff member in the U.S. House of Representatives. In the 111th Congress, he was responsible for oversight and investigations of presidential libraries, the National Archives, and all federal information policy for the House Committee on Oversight and Government Reform. He writes about presidential legacy and Congress, and has been published by Time, Salon, and History News Network. His article on the George W. Bush Presidential Library was the cover story at Salon.com the day the library was dedicated in April, 2013, and was featured that evening in a segment on MSNBC's The Last Word with Lawrence O'Donnell.

A recognized expert in the Freedom of Information Act, federal records, and presidential records and libraries, he has been interviewed about presidential libraries by Roll Call, Reuters, Chicago Public Radio (WBEZ), the Chicago Tribune, Crane's Chicago Business, the Honolulu Star-Advertiser, New York Public Radio (WNYC), the Orange County Register, Pacific Standard Magazine, KCRW, and Lettera43, the Italian newsmagazine. He earned a Master’s degree in Management & Systems from New York University, and spent eighteen years as an information technology consultant.

Anthony is the author of The Last Campaign: How Presidents Rewrite History, Run for Posterity & Enshrine Their Legacies, a book about the politics of presidential libraries, to be published on March 16, Freedom of Information Day.


2014

Tuesday, November 4, 2014

1:30 p.m. - 2:00 p.m., Meet the Speaker with coffee and cookies
2:00 p.m., Talk
IS Building, 3rd floor

Atsuyuki Morishima, Professor of Research Center for Knowledge Communities, Graduate School of Library and Information and Media Studies, University of Tsukuba

Crowd4U: Toward an Earth-scale Volunteer Network for Microtask-based Crowdsourcing

Abstract: This talk overviews the FusionCOMP project, which addresses problems in software engineering for data-centric crowdsourcing applications. The project started in 2009 and has developed a programming language named CyLog and its execution platform, Crowd4U. CyLog is a Datalog-like language that allows some predicates to be evaluated by humans. Crowd4U provides an execution engine for CyLog code and is being used for public and academic crowdsourcing projects involving more than 20 universities. This talk explains our challenges and some of the results we have obtained in the project.
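The core idea of a human-evaluated predicate can be sketched outside CyLog itself. The hypothetical Python sketch below stands in for CyLog's actual syntax and semantics: an "open" predicate whose truth values come from crowd answers (here a simulated oracle) rather than from computation, used inside a Datalog-style rule.

```python
def make_crowd_predicate(ask):
    """Wrap a question-asking function as a memoized boolean predicate.

    In a platform like Crowd4U, `ask` would post a microtask to human
    workers; here it is whatever callable the caller supplies.
    """
    answers = {}
    def pred(x):
        if x not in answers:
            answers[x] = ask(x)  # each item is asked at most once
        return answers[x]
    return pred

# Simulated crowd: a fixed oracle stands in for human workers.
simulated = {"img1": True, "img2": False, "img3": True}
is_cat = make_crowd_predicate(lambda x: simulated[x])

# A Datalog-style rule "gallery(X) :- image(X), is_cat(X)", evaluated in Python:
images = ["img1", "img2", "img3"]
gallery = [x for x in images if is_cat(x)]
# gallery == ["img1", "img3"]
```

The memoization step mirrors a practical concern in microtask crowdsourcing: human evaluations are slow and costly, so each ground fact should be asked for only once and then cached.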

Bio: Atsuyuki Morishima is a professor of Research Center for Knowledge Communities, Graduate School of Library and Information and Media Studies, University of Tsukuba, Japan. For three years, he has been working with domain experts on several volunteer-based crowdsourcing projects in the library and natural disaster domains. He served as program committee members and conference officers of conferences including VLDB, SIGMOD, ICDE, EDBT, CIKM, ECDL, iConference, DASFAA, ICADL, XSym, PersDB, and DBCrowd.


Monday, November 3, 2014

11:00 a.m.
IS Building, 3rd floor

Marios Kokkodis, PhD Candidate at NYU Stern School of Business (Information Systems)

Inefficiencies in Online Labor Markets

Abstract: In an online labor marketplace (OLM), employers post jobs, receive worker applications, and make hiring decisions. Once hired, workers complete the tasks online and receive their payment along with feedback about their performance. Because of the natural heterogeneity that appears in task categories, skills, and the latent abilities of workers and employers, these markets suffer from a series of inefficiencies. In this talk I will focus on three problems: (1) How do employers make hiring decisions? (2) How should workers expand their skill sets? (3) How does past experience transfer to new tasks? This work has a series of very important implications for the marketplace and its users: First, employers make better-informed and faster hiring decisions. Furthermore, workers build up their demand by strategically improving their skill sets. As a result, the marketplace becomes more efficient and experiences an increase in transaction volume as well as an increase in the overall satisfaction of the workers and the employers.

Bio: Marios Kokkodis is currently a fifth-year PhD candidate in the Information Systems group at NYU Stern School of Business. His advisor is Prof. Panos Ipeirotis. His research focuses on inefficiencies in online labor markets (e.g., oDesk.com). He experiments on more than 3.5 million job applications from oDesk.com to show evidence that his approaches reduce friction and create a more efficient marketplace. His research has been recognized by multiple awards (INFORMS Data Mining Student award, HCOMP Doctoral Consortium) and publications (Management Science, ICIS, WSDM).


Wednesday, October 29, 2014

6:30 p.m., Meet the Speaker with pizza
7:00 p.m., Talk
IS Building, Room 501

Joe Trost, Director of Software Verification Test (SVT) at Tollgrade Communications, Inc.

SmartGrid – An Introduction to Medium Voltage Utility Power Line Sensors

Abstract: Smart Grid reliability starts with better visibility into the backbone of the grid – the distribution network. With distribution monitoring systems, operators can better detect faults with real-time information (e.g., type of fault, cause, and location); classify different types of line disturbances; and continuously monitor load and power quality across all three phases of a medium voltage distribution network. Monitoring is made possible with Medium Voltage (MV) sensors that are inductively powered, operate down to 3 amps, and offer flexible communications through cellular or Wi-Fi. Key sensor measurements include load current, fault current, electric field strength, power factor, phase angle, sags, surges, wire temperature, and harmonics. This talk will include a demonstration of sensors from Tollgrade. We will power them up and simulate power events on the power line so we can watch the sensors “see” the events and transmit them back to the management station.

Bio: V. Joseph Trost started tinkering with computers while in high school back in 1976. His dad had remote access to banking-industry mainframes, and Joe could log in after hours and play computer games. During the 1980s, Joe was a computer operator in New York City and eventually took a job with a software consulting firm as a programmer analyst, writing code in COBOL, Fortran, Pascal, and C. Late in the ’80s, Joe recognized the potential of computer networking and decided to focus his studies on a new technology called the “Internet”. That led him to Rich Thompson and the Telecommunications Program at the University of Pittsburgh, and then to various positions at FORE Systems, Marconi Communications, Ericsson, and Tollgrade.

Joe is currently the Director of Software Verification Test (SVT) at Tollgrade Communications, Inc., located in Cranberry Township (30 minutes north of Pittsburgh). Joe's work at Tollgrade focuses on quality assurance testing of the LightHouse (SmartGrid) product line. LightHouse consists of various hardware and software products that allow electrical utilities to monitor the status of their distribution networks in real time.

Prior to working at Tollgrade, Joe was the Director of ATM Engineering for Ericsson (a communications infrastructure company based in Sweden). Joe's experience with ATM networking technology started in the Technical Assistance Center (TAC) at FORE Systems back in 1995. While at FORE, Joe designed one of the largest ATM networks ever built, for the National Security Agency at Fort Meade in Maryland. Joe holds a Master of Science in Telecommunications from the University of Pittsburgh and a Bachelor of Science in Decision Sciences from Rider University.


2013

Friday, March 29, 2013

10:30 a.m.
IS Building, Room 403

David Wallace, Clinical Associate Professor (commencing September 2013) at the School of Information, University of Michigan

‘Burdened with the Truth’: (Wiki)-leaking and the Unbalancing of Ritualized Access

This event is sponsored by the Society of American Archivists Student Chapter.

Wikileaks seemed to emerge out of nowhere in 2010 and, in rapid succession, issued the largest-ever series of leaks of classified and confidential US government documents. In fact, Wikileaks had been at work since 2007 and had ingested and exposed a wide array of documents evidencing corruption and other malfeasance across the globe. The 2010 leaks were of a different magnitude, however, and resulted in sharp rebukes from governments, human rights organizations, corporations, the mainstream media, and even the public. Why had Wikileaks elicited such a sharp response from so many sectors of society? Compelling perspectives beyond the simplified explanations that dominated media coverage noted that the leaks surfaced deep, disturbing, and embarrassing contradictions between the public rhetoric and private (classified) actions of leading global powers, and that their contents mocked the legitimacy and performance of accountability mechanisms. By rupturing established structures of information control, access, and dissemination, the leaks raised powerful questions about the potential of “new” technologies to spur overt contestation over what exists in the public domain. And Wikileaks is not alone: it is only one actor within a larger “leaking movement” that now exists on the open web. Given these realities, how might the archival and records management professions navigate this terrain? This talk will review the (Wiki)-leaking phenomenon and examine how it can be interpreted and understood within archival and records management professional practice frameworks and philosophies. It will also examine the challenges that web technologies and web activism present to the core professional responsibility to develop a meaningful and representative record in an age of global turmoil.

David A. Wallace, Ph.D. is Clinical Associate Professor (commencing September 2013) at the School of Information, University of Michigan. He has been a full-time graduate archival educator since 1997. For over two decades, he has published and presented in a wide range of professional forums, examining: recordkeeping and accountability; archiving and the shaping of the present and the past; social justice impact of archives; freedom of information; government secrecy; professional ethics; electronic records management; and graduate archival education. He is co-editor of Archives and the Public Good: Accountability and Records in Modern Society (2002), and served as the series technical editor for twelve volumes of the National Security Archive's The Making of U.S. Policy series (1989-1992). In 2001 he received ARMA International's Britt Literary Award for best article in their Information Management Journal. He has consulted widely, including substantial associations with the South African History Archive’s Freedom of Information Programme, and Stories For Hope, an intergenerational storytelling NGO in Rwanda.

Monday, March 4, 2013

10:00 a.m.
IS Building, Room 501

Thomas Baker, CIO of the Dublin Core Metadata Initiative

Rethinking the Library Catalog as Linked Data

A fixed data format, MARC, has provided coherence to library catalogs for over forty years. In the age of Linked Data, however, it is now seen as relegating catalog data to an isolated (and expensive) silo. While the imminent death of MARC has been officially announced, its Linked Data-based replacement, the Bibframe Initiative of the Library of Congress, is still in the early phases of design. Meanwhile, the usage guidelines that for decades co-evolved with MARC, the Anglo-American Cataloguing Rules, are being replaced by Resource Description and Access (RDA), now based on Functional Requirements for Bibliographic Records (FRBR), a conceptual model that remains largely untested in practice. Enter Google, whose weighty Schema.org initiative overlaps with all of the above. These initiatives take fundamentally different approaches to translating the catalog into languages of the Semantic Web. This talk discusses "ontological" differences and practical consequences for the interoperability of library catalogs in a Linked Data environment.
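To make the contrast with MARC concrete, here is a toy sketch of what "catalog data as Linked Data" means: a record decomposed into independent, web-addressable statements rather than a single monolithic record. The URIs and property names are invented for illustration; they are not actual Bibframe, RDA, or Schema.org terms.

```python
# Toy sketch: one catalog record as subject-predicate-object triples.
# All URIs below are illustrative placeholders (example.org), not real
# vocabulary terms from Bibframe, RDA, or Schema.org.

book = "http://example.org/book/moby-dick"

triples = [
    (book, "http://example.org/prop/title", "Moby-Dick"),
    (book, "http://example.org/prop/creator", "Herman Melville"),
    (book, "http://example.org/prop/issued", "1851"),
]

# Serialize in an N-Triples-like form: each statement stands alone, so
# any other catalog on the web can link to `book` without having to
# parse a MARC record.
for s, p, o in triples:
    print(f'<{s}> <{p}> "{o}" .')
```

The point of the sketch is the granularity: where MARC bundles everything into one fixed-format record, Linked Data makes each assertion independently addressable and linkable, which is what enables interoperability across catalogs.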

Tom Baker is CIO of the Dublin Core Metadata Initiative (DCMI). He has co-chaired the W3C Library Linked Data Incubator Group and the W3C Semantic Web Deployment Working Group, which published the Simple Knowledge Organization System (SKOS). Tom holds an MLS from Rutgers University and an MA and PhD in Anthropology from Stanford University. He has worked as a researcher at an economic institute in Italy; taught at the Asian Institute of Technology in Thailand; led projects at the German National Research Center for Information Technology (GMD, later Fraunhofer) and the Goettingen State Library; and consulted with organizations such as the UN's Food and Agriculture Organization (FAO).

2012

Wednesday, October 10, 2012

2:00 p.m.
IS Building, Room 522

Lora Aroyo, Associate Professor, The Network Institute, VU University Amsterdam

"Dial E for Event - or Harnessing Disagreement in Text Annotation through Crowdsourcing"

The focus of this talk is on using crowdsourcing as a source for various types of semantics in order to improve the experiences of users with collections online. The talk will give a brief overview of multiple initiatives in this area; however, the main focus will be on the semantics of events. Events increasingly appear as central in different domains and applications. They often help bring meaning to objects online, to tasks in ubiquitous scenarios, and to numerous information management structures. Thus, events are more and more in demand, yet at the same time their semantics remain vague and unclear. In this talk, I will present an approach based on the theory that disagreement among humans about events constitutes a natural state, and that we can harness this disagreement through crowdsourcing to bring a new kind of meaning to events. Objects, like people, locations, and various other types of named entities, are often easy to detect in language and present on the semantic web. Without events, however, they lack meaning. Assigning roles to objects in events is a step towards bringing them meaning, but the detection and representation of events is much harder than for objects; events are typically not named, and humans have difficulty identifying them and distinguishing their boundaries, as well as linking and ordering them consistently. Many event-centric approaches in NLP have attempted to “fix” the problem of human disagreement regarding events by over-specifying their semantics for isolated tasks, but this leads to brittleness and lack of coverage.
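One way to see what "harnessing disagreement" might mean in practice is to measure disagreement instead of discarding it. The sketch below is a generic disagreement metric in the spirit of the abstract, not the speaker's actual method: each worker tags a sentence with the event type they perceive, and instead of forcing a single "gold" label we keep the full label distribution and score how contested the sentence is.

```python
# Illustrative sketch: scoring annotator disagreement per item rather
# than resolving it to one label. The metric and event labels are my
# own assumptions, not the approach presented in the talk.
from collections import Counter

def disagreement(labels):
    """1 - (share of the majority label); 0.0 means full agreement."""
    counts = Counter(labels)
    return 1.0 - max(counts.values()) / len(labels)

clear = ["attack", "attack", "attack", "attack"]
contested = ["attack", "meeting", "attack", "travel"]

print(disagreement(clear))      # → 0.0 (all annotators agree)
print(disagreement(contested))  # → 0.5 (majority label covers only half)
```

A high score then becomes a signal in its own right, e.g. that the sentence describes an event with genuinely fuzzy boundaries, rather than noise to be cleaned away.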

Lora Aroyo is an associate professor at The Network Institute, Department of Computer Science, VU University Amsterdam (currently on sabbatical leave at IBM Research). Her research focuses on using semantic web technologies for modeling user interests and context, and applying them in recommendation systems and personalized access to online cultural heritage collections, multimedia archives, and interactive TV. She has coordinated the CHIP project on Cultural Heritage Information Personalization and the NoTube project on the integration of Web & TV with the help of semantics. She has co-organized numerous workshops on personalized access to cultural heritage, e-learning, interactive television, and visual interfaces to the social and semantic web (e.g., PATCH, FutureTV, PersWeb, VISSW, and DeRIVE). Lora is actively involved in the Semantic Web community, serving as program co-chair for ESWC2009 and ISWC2011 and as conference chair for ESWC2010. She is also active in the personalization and user modeling community, as vice-president of UM Inc. and a member of the editorial board of the UMUAI journal.

2011

2010

2009

2007

2006

2005

2004

Colloquia

Disseminating research ideas and findings through Colloquia is part of the School's mission, and new students and faculty alike enjoy this vibrant intellectual community.