ADAM N. ROSENBERG, Ph.D.
7828 East Pleasant Run Court
Scottsdale, Arizona 85258-3106
      1-480-948-1656 home
      1-480-882-8839 cell
      adam@the-adam.com
      http://the-adam.com

    

WORK OBJECTIVE

     I enjoy difficult mathematical decision support problems and have found that my greatest strength is my ability to find practical ways to add value to many different types of business operations. Along the way, I have acquired in-depth knowledge of several industries and have met some very interesting people. I hope to take on new challenges and to continue to broaden my experience.

    

CAREER SUMMARY (1982-2011)

    

     For twenty-nine years I have envisioned, designed, developed, and delivered decision support systems that add economic value: several hundred million dollars (U.S.) of demonstrable benefit in yield management and airline fleet assignment, and quite a bit more than that in helping other people become more productive. I have worked within teams and alone in a wide variety of industries and work settings. My projects include big optimizations, intricate simulations, complex analyses, and small tools that save time. My common theme is working within existing information infrastructure, using traditional methods in non-traditional ways, to produce enduring and beneficial solutions.

    

    

WORK EXPERIENCE (FULL TIME)

    

     Clear Demand, Chief Science Officer, 2011 October through the present.

     I decided to start my own retail-science company and to extend the mathematical science we used a decade ago to solve the price-optimization problem on today's faster, cheaper computers. So far I've developed a retail price optimizer that uses a new maximum-likelihood model and a better tradeoff algorithm for making money while conforming to business rules. Two friends joined me in 2012 April, we incorporated Clear Demand, and we're on our way.

    

     US Airways, Scientist, 2010 June through 2011 September.

     US Airways hired me with a specific mission: to bring retail-science insights to airline revenue management (RM). Classical RM, also called "yield management," deals with passengers tied to specific products and restrictions (a Saturday-night stay, for example). Twenty-first-century airline passengers buy tickets from web pages and other channels where price is a primary consideration, just like retail. I'm using my Khimetrics experience to reshape the mathematics being used to book passengers.

     My first step was a competitive-booking simulation of price-sensitive travelers trying to book travel on multiple-carrier airline networks. Options include "classic" algorithms similar to those running today and forward-looking algorithms. More recent work includes formulating revenue-management models to carry out the forward-looking methods tested in simulation.

     I've also had a technical-leadership role in the Revenue-Management Operations-Research group. Mathematical algorithms have complicated data requirements and unintended side effects, so somebody who has evaluated airline algorithms before, as I did at Northwest Airlines, can see complex interactions more quickly.

    

     Blackboard Inc., Consultant, 2010 January through 2010 March.

     This was a short contract with a software company. The Blackboard product works in educational institutions all over the United States. Problems with the then-current version prompted them to hire me to help work through similar issues in the upcoming release. After some basic exploration, they decided not to continue this line of work.

    

     MyWorld Investing, Consultant, 2009 April through 2009 December.

     Analysis of commodity investing with a collateral loan: Changes in United States government spending suggest trends in commodity prices over the next five years. Investors in gold, silver, platinum, or copper might want to borrow money to invest in these metal markets. My computational output is a strategy for a specific loan, collateral constraints, and market expectations over time. Using this dynamic program, we can evaluate the expected return on investment and its statistical distribution, and also how that return is affected if market conditions differ from what we expected.
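     The heart of this analysis is backward induction. The toy C fragment below sketches the idea on a binomial price lattice with an option each period to sell the collateral and repay the loan; all numbers are invented, the sketch skips discounting for brevity, and the real model carried much richer loan and market structure.

```c
/* Toy backward-induction sketch: binomial lattice of metal prices,
   each period choosing between selling the collateral (repaying the
   accrued loan) and holding. Illustrative parameters only. */
#include <stdio.h>

#define T 4                                  /* decision periods */
static double up = 1.15, down = 0.90, p_up = 0.55;
static double loan = 80.0, rate = 0.02;      /* balance, per-period interest */

static double lattice_price(int t, int i) {  /* price after i ups in t steps */
    double price = 100.0;                    /* assumed starting metal price */
    for (int k = 0; k < i; k++) price *= up;
    for (int k = 0; k < t - i; k++) price *= down;
    return price;
}

int main(void) {
    double value[T + 1];
    double debt = loan;
    for (int t = 0; t < T; t++) debt *= 1.0 + rate;

    /* terminal period: forced sale, repay the accrued loan */
    for (int i = 0; i <= T; i++)
        value[i] = lattice_price(T, i) - debt;

    /* roll back: each period take the better of selling now or holding */
    for (int t = T - 1; t >= 0; t--) {
        debt /= 1.0 + rate;
        for (int i = 0; i <= t; i++) {
            double hold = p_up * value[i + 1] + (1.0 - p_up) * value[i];
            double sell = lattice_price(t, i) - debt;
            value[i] = hold > sell ? hold : sell;
        }
    }
    printf("expected value of the strategy today: %.2f\n", value[0]);
    return 0;
}
```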

     Evaluation of asset forecasting to portfolio value over time: Using ten years of daily data on asset prices and forecasts, covering thousands of assets and hundreds of forecasters, this consulting project produced software and computational results showing the value of single- and multiple-forecaster contributions to a portfolio that selects assets from this list.

    

     Property Informatics, Chief Scientist, 2008 March through 2009 March.

     Medical Site Selection: Using United States census data and doctor office location data from SK&A, we built a database of quarter-mile "bins" with doctors (servers) and population (users). I designed and developed a program that used the Huff model from retail and some nice computation tricks to generate a "heatmap" with a patient-count estimate for each bin based on proximity and other attractive attributes including being near a freeway or near a hospital. I worked with my associate to build a more-robust database using open-source PostgreSQL with more precise data from the census. We have one client so far and he's happy with the report.
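     The Huff model itself is plain gravity-model arithmetic and easy to sketch. The C fragment below is a toy with invented numbers, not our production heatmap code: each bin's population is split among the doctor sites in proportion to attractiveness divided by distance raised to a decay exponent.

```c
/* Minimal Huff-model sketch. Assumed toy inputs: bin populations,
   site attractiveness (e.g. near a hospital or freeway), distances. */
#include <stdio.h>
#include <math.h>

#define NBINS  3   /* population bins (users)  */
#define NSITES 2   /* doctor offices (servers) */

int main(void) {
    double pop[NBINS]      = {1200, 800, 1500};   /* people per bin */
    double attract[NSITES] = {2.0, 1.0};          /* attractiveness weights */
    double dist[NBINS][NSITES] = {{0.5, 2.0}, {1.0, 0.75}, {2.5, 0.25}};
    double beta = 2.0;                            /* distance-decay exponent */
    double patients[NSITES] = {0};

    for (int i = 0; i < NBINS; i++) {
        double denom = 0.0;
        for (int j = 0; j < NSITES; j++)
            denom += attract[j] / pow(dist[i][j], beta);
        for (int j = 0; j < NSITES; j++)
            patients[j] += pop[i] * (attract[j] / pow(dist[i][j], beta)) / denom;
    }
    for (int j = 0; j < NSITES; j++)
        printf("site %d: %.0f expected patients\n", j, patients[j]);
    return 0;
}
```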

     Forecasting Commercial Real Estate: With economic indices (including Phoenix-area "blue chip" business conditions indices) and real-estate values including vacancy, rental rates, new construction, occupancy, and absorption, I wrote a time-series forecast engine that finds the economic indices that best predict the real-estate values. We demonstrated this with my graphics and impressed a lot of people, but we found it's hard to turn a forecast into a product. Later, I worked with an associate who wrote the application in MATLAB with superior graphics and user interface.

    

     SAP/KhiMetrics, Staff Scientist, 2003 October through 2008 March.

     Sales Aggregation Module (SAM): When I got to Khimetrics, the software and data systems lagged behind the models, forecasts, and price optimization, so I built a data-cleansing engine called SAM. Still in regular production use for several clients, SAM identifies out-of-stock time periods and short promotions based on unit sales, removes outliers, and integrates the sales data with the promotion calendar. Ideally, out-of-stock conditions and short promotions would come from client data, but they don't, so we infer them.
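     The out-of-stock inference is easy to illustrate. The toy C below flags any run of zero-sales periods longer than a threshold; the data and threshold are invented, and the production SAM logic also weighs expected sales rates and the promotion calendar.

```c
/* Sketch of out-of-stock inference: flag runs of zero-sales weeks
   longer than a minimum length. Shorter zero runs are treated as
   ordinary sales noise. */
#include <stdio.h>

int main(void) {
    int sales[] = {12, 9, 0, 0, 0, 0, 11, 13, 0, 10};  /* weekly units */
    int n = sizeof sales / sizeof sales[0];
    int min_run = 3;   /* assumed threshold for a believable stock-out */

    int start = -1;
    for (int t = 0; t <= n; t++) {
        if (t < n && sales[t] == 0) {
            if (start < 0) start = t;          /* zero run begins */
        } else if (start >= 0) {               /* zero run ends   */
            if (t - start >= min_run)
                printf("out of stock, weeks %d..%d\n", start, t - 1);
            start = -1;
        }
    }
    return 0;
}
```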

     Markdown Optimization Engine (MOE): With modeling, forecasting, and pricing already in the product line, Khimetrics moved on to markdown, the clearance of obsolete inventory (not fashion markdowns, which are price promotions of current inventory). The team effort for markdown was, in my opinion, the high point of Khimetrics, where user interface, database, customer interaction, analysis, and algorithm software came together. I was responsible for designing, documenting, and writing MOE, the optimization software for markdown. The client reported major gains: 20% more inventory sold at nearly 20% higher prices, for tens of millions of dollars of improvement.

     Promotion Optimization Engine (POE): While Khimetrics had made several attempts at measuring, modeling, and forecasting promotion in retail, this was our first attempt at optimization. Organized as a software development rather than a client solution, and without a cooperative client, this effort was less spectacular than our markdown success. Still, we solved a difficult problem: recommending products, offers, and prices for a retail promotion. I designed, documented, and developed POE, the optimization engine for promotion planning.

     Staff Scientist: At Khimetrics I took on a role beyond the software development of high-power analytic-computing engines. There was a role for a house mathematician, somebody who could figure out which method should be used or, more often, why some simpler algorithm would not do. This role expanded over my five years with Khimetrics and SAP/Khimetrics.

     Education and Training: When SAP bought Khimetrics and the audience for our retail science broadened to several other offices, my role as its teacher expanded. In addition to introducing new analytic people to our science, I trained developers and sales people in the general problems Khimetrics science solved and in the specific data and analytic structures we used to solve them. I gave full retail-science courses in Scottsdale and in India.

    

     InterContinental Hotels Group, Technical Advisor, 2000 September through 2003 September.

     Hotel Booking Simulation: I started my work in hotel revenue management the way I started my work in airline revenue management, with a simulation of the booking process. I found many issues where there was not yet a consensus among people in our revenue-management group: there wasn't a clear model of demand elasticity, or of booking overflow when rate categories were filled. My work became more an exercise in asking questions than in finding answers, which was very interesting.

     Rate Hierarchy Assignment: The booking process relies on grouping room-rate codes by their prices. Earlier systems simply hard-coded the rate codes into levels from a table, but in more recent hotel practice rates varied enough within a rate code that a dynamic method was needed. In searching for the simplest solution, I went through two false starts, but since I had worked them through with full consensus from the revenue-management group, the group was fully prepared when we had to go to something more complicated. We ended up with a sophisticated dynamic-program solution. This rate-hierarchy assignment is still running today.
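     The final dynamic program has the classic shape for grouping sorted values into contiguous levels. The C sketch below shows that recursion on invented rates, minimizing within-level squared deviation; the production assignment used different inputs and a different objective.

```c
/* DP sketch: split N sorted rates into K contiguous price levels,
   minimizing total within-level squared deviation from the level mean. */
#include <stdio.h>
#include <float.h>

#define N 8   /* rates, sorted ascending */
#define K 3   /* hierarchy levels        */

static double rate[N] = {59, 62, 64, 89, 95, 99, 149, 159};

/* sum of squared deviations of rate[a..b] from its mean */
static double segcost(int a, int b) {
    double sum = 0, sq = 0;
    for (int i = a; i <= b; i++) { sum += rate[i]; sq += rate[i] * rate[i]; }
    return sq - sum * sum / (b - a + 1);
}

int main(void) {
    double dp[K + 1][N];   /* dp[k][i]: best cost of k levels over 0..i */
    int cut[K + 1][N];     /* cut[k][i]: where the last level begins-1  */

    for (int i = 0; i < N; i++) dp[1][i] = segcost(0, i);
    for (int k = 2; k <= K; k++)
        for (int i = 0; i < N; i++) {
            dp[k][i] = DBL_MAX;
            for (int j = k - 2; j < i; j++) {
                double c = dp[k - 1][j] + segcost(j + 1, i);
                if (c < dp[k][i]) { dp[k][i] = c; cut[k][i] = j; }
            }
        }

    /* walk back the level boundaries, highest level first */
    int hi = N - 1;
    for (int k = K; k >= 1; k--) {
        int lo = (k == 1) ? 0 : cut[k][hi] + 1;
        printf("level %d: rates %d..%d\n", k, lo, hi);
        hi = lo - 1;
    }
    return 0;
}
```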

     Group Bookings Foundation: We were going to build some kind of group-bookings revenue management system that would use transient booking levels (bookings from regular hotel guests who aren't with a group or conference). I realized that meeting space and conference facilities could become central to a system managing meeting and conference revenue, perhaps even a full-blown yield-management system for these shared facilities.

     New HIRO Optimizer: The outcome of the group-bookings discussions was the realization that we needed a Hotel Inventory Revenue Optimizer (HIRO) designed specifically for hotels, with group bookings as an expected addition. People expected an airline-yield-management person like me to build an airline-style yield-management optimizer, but I didn't. The hotel booking environment has far more rate codes, not necessarily in a hierarchy, and a far simpler multiple-night booking network than airline itineraries. So I designed, documented, developed, and tested an optimizer built specifically for the hotel environment. This optimizer is still running today.

     HIRO Value Study: Since 90% of our hotels were franchised, owned by individuals rather than by the corporation, many hotels were not using the HIRO model for revenue management. As hotels changed from non-HIRO to HIRO status, I was able to do an informal statistical study, and I found an increase in revenue relative to each hotel's competitive set after they became HIRO hotels.

    

     CANAC, Inc, Principal Consultant, 1999 August through 2000 September.

     LLCM Rail Simulation: CANAC hired me to continue my work on the Long Line Capacity Model, which they had purchased from my previous employer. I was able to model more sophisticated train-conflict handling than before, including "multiple resource fouling," where a stopped train at one junction blocks other junctions in its wake.

     Interlocking Connection Builder: A network of rail connections is called an "interlocking." This can be as simple as a siding coming into a main line of track, or as complex as dozens of switches and cross-track links across three or four parallel tracks. Enumerating all the possible routes a train can take through an interlocking is essential to modeling rail-network capacity. The tool I built allows a user to construct an interlocking diagram graphically and then enumerates all the connections through it.
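     Enumerating the routes is, at its core, a depth-first search. The C sketch below lists every simple path through a toy six-node interlocking; real interlockings add direction, switch positions, and fouling constraints on top of this skeleton.

```c
/* Depth-first enumeration of all simple paths through a toy
   interlocking graph, from an entry node to an exit node. */
#include <stdio.h>

#define N 6
static int adj[N][N] = {
    {0,1,1,0,0,0},   /* 0: entry, switches to 1 or 2 */
    {0,0,0,1,1,0},
    {0,0,0,0,1,0},
    {0,0,0,0,0,1},
    {0,0,0,0,0,1},
    {0,0,0,0,0,0}    /* 5: exit */
};
static int path[N], depth, visited[N];

static void dfs(int v, int goal) {
    path[depth++] = v;
    visited[v] = 1;
    if (v == goal) {                       /* print one complete route */
        for (int i = 0; i < depth; i++) printf("%d ", path[i]);
        printf("\n");
    } else {
        for (int w = 0; w < N; w++)
            if (adj[v][w] && !visited[w]) dfs(w, goal);
    }
    visited[v] = 0;                        /* backtrack */
    depth--;
}

int main(void) {
    dfs(0, 5);   /* list all routes from entry 0 to exit 5 */
    return 0;
}
```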

    

     Provar, Inc, Senior Consultant, 1997 April through 1999 July.

     Used Car Residual Value Study: This work was done while our company was still called Aeon Decision and Data Solutions, for a client leasing used cars over four-year terms the way new cars are leased. The key to running a successful leasing business is having a better forecast of aging-car values than other companies, so we were hired to provide one. I wrote a regression model, not mathematically complex, but enough to outperform the blue book and other existing car-valuation references.

     TPM Rail Simulation: The Train Performance Model was a client's legacy program that figured out train movement speeds and times based on speed limits, train weights, engine power, track grades, and even wind. I recoded and documented this model so it could run as part of a larger system like LLCM.

     LLCM Rail Simulation: The Long Line Capacity Model was an extension of a much simpler Line Capacity Model (LCM) already in use at the railroad. LCM handled only double track, and memory limitations kept it from modeling a network of realistic size or length. LLCM had a sophisticated tree-path-search heuristic for routing trains, so it could evaluate the capacity of a track network to support a schedule. It was used to justify substantial investment in upgrading single track to double track.

     Flood Plain Determination: Scanning 106,000 paper flood-plain maps from the Federal Emergency Management Agency (FEMA) was only the first step in automating flood-plain determination for mortgage applications. The second step was designing systems to allow summer workers to geo-reference the maps that did not have latitudes or longitudes on them. My work was the third step: finding all the shaded gray and stipple areas as polygon boundaries without getting confused by lettering, roads, and railroads. The program I designed and developed found the gray areas for all of these maps.
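     The core of the gray-area step is connected-component labeling. The C sketch below flood-fills a toy bitmap; the production program worked on large scanned images and had to reject lettering, roads, and railroads before tracing polygon outlines.

```c
/* Flood-fill sketch: label connected "gray" pixels into components
   whose outlines become polygon candidates. Toy bitmap only. */
#include <stdio.h>

#define W 8
#define H 6
static char img[H][W + 1] = {   /* '#' = gray/stipple, '.' = background */
    "..##....",
    ".###..#.",
    "..##..##",
    "........",
    ".##.....",
    ".##....."
};
static int label[H][W];

static void fill(int y, int x, int id) {
    if (y < 0 || y >= H || x < 0 || x >= W) return;
    if (img[y][x] != '#' || label[y][x]) return;
    label[y][x] = id;
    fill(y + 1, x, id); fill(y - 1, x, id);   /* 4-connected neighbors */
    fill(y, x + 1, id); fill(y, x - 1, id);
}

int main(void) {
    int id = 0;
    for (int y = 0; y < H; y++)
        for (int x = 0; x < W; x++)
            if (img[y][x] == '#' && !label[y][x]) fill(y, x, ++id);
    printf("%d shaded regions found\n", id);
    return 0;
}
```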

    

     InterDigital Communications Corporation, Staff Scientist, 1995 November through 1997 April.

     CDMA Capacity Study: InterDigital was one of the technical pioneers of Code Division Multiple Access, then the up-and-coming mobile telephone technology in the United States. At the time, CDMA was an untested theory, and nobody really knew its capacity in a dynamic cellular telephone system. There were "static" equations for steady state, which I was able to extend using linear algebra and sophisticated numerical techniques ("computing tricks") to get the first handle on the dynamic capacity and blocking rate of a CDMA system.
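     The static starting point is the textbook reverse-link "pole capacity" equation. The C fragment below evaluates it with typical IS-95-style parameters; these numbers are illustrative assumptions, not InterDigital data, and my dynamic analysis went well beyond this steady-state figure.

```c
/* Back-of-the-envelope CDMA reverse-link pole capacity:
   N = (W/R) / (Eb/N0 * v * (1 + f)). Textbook parameter values. */
#include <stdio.h>

int main(void) {
    double W    = 1.2288e6;  /* chip rate, chips/s (IS-95)        */
    double R    = 9600.0;    /* voice data rate, bits/s           */
    double EbNo = 5.0;       /* required Eb/N0, linear (about 7 dB) */
    double v    = 0.4;       /* voice activity factor             */
    double f    = 0.6;       /* other-cell interference ratio     */

    double pole = (W / R) / (EbNo * v * (1.0 + f));
    printf("pole capacity: about %.0f simultaneous users per sector\n", pole);
    return 0;
}
```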

     Power Control Simulation: The algorithms for rapid power control of a fast-moving, time-varying signal were also unknown at this time. With ultra-frequent power adjustments, we thought our technology would be able to track these changes. My simulation showed that measurement error lengthened the response time of power control considerably over our original estimates.

    

     Northwest Airlines, Senior Consultant, 1991 April through 1995 October.

    

     Airline Booking Simulation: In anticipation of new yield management technology, I was asked to write a computer simulation of airline booking from first listing of flights through day of departure. Starting with a single hub-and-spoke "complex" of inbound and outbound flights, I was able to expand the simulation to the entire Northwest Airlines network, thousands of flight legs and 100,000 connections.

     Enhanced Network Value Indexing for Yield Management: The booking simulation needed a new booking policy to test, so I came up with a leg-based "displacement cost" strategy we called ENVI. It became the airline's yield management system from 1993 to 2006, earning about $30 million per year in incremental revenue within the constraints of the existing forecasting and reservations systems. I designed, documented, developed, and tested both the simulation and ENVI.
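     The displacement-cost acceptance rule at the heart of ENVI is simple to state: take a connecting booking only when its fare exceeds the sum of the displacement costs of the legs it uses. The C fragment below shows the comparison with invented numbers; computing good displacement costs is where the real work was.

```c
/* Displacement-cost acceptance check: accept an itinerary only if
   its fare beats the revenue displaced on every leg it touches. */
#include <stdio.h>

#define NLEGS 3

int main(void) {
    double displacement[NLEGS] = {120.0, 85.0, 60.0}; /* $ per seat per leg */
    int itinerary[] = {0, 2};                         /* legs this trip uses */
    int nlegs = 2;
    double fare = 210.0;

    double cost = 0.0;
    for (int i = 0; i < nlegs; i++)
        cost += displacement[itinerary[i]];

    printf("fare %.2f vs displaced %.2f: %s\n", fare, cost,
           fare > cost ? "accept" : "reject");
    return 0;
}
```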

     Delay and Cancellation Reporting System: Starting with delays for eight international cargo airplanes, my delay-tracking system grew to cover the entire airline. I was able to present back-tracking of delay causes in a format familiar to managers using legacy systems. My software allows the airline to calculate "allocated" delay times from root causes, including their downline consequences.

     Jet Engine Reliability Study: When high engine removal rates were plaguing one of our engine types, I was asked to do a statistical regression to see which components of our watchlist were meaningful. After trying sophisticated "logit" models, I found simple linear techniques were more revealing and was able to find significant behavior differences between sub-classes of this engine type.

     Scheduling Tools: I started my scheduling-support work with a general connection builder to derive routes passengers might reasonably fly from origin to destination. That connection builder was used to bid for military transport contracts and evolved into a true connection builder with all the hub, customs, and ground-time constraints. I also built schedule-comparison and weekend-exception programs that are still in use fifteen years later.

    

    

     AT&T Bell Laboratories, Member of Technical Staff (MTS), 1982 April through 1991 March.

     Cellular Systems Engineering Analyses: Starting with a computer program called AUTOGROW, which used canonical geometric frequency reuse to grow the radio capacity of a cellular telephone system, my role as a computational analyst grew into a full-blown simulation of a cellular telephone radio system with all its call processing. This M2 simulation was followed by NOVA, Non-regular Optimal Voice-channel Assignment (or "Non-Optimal," as one wag called it), which used a combinatorial simulated-annealing method to get the full benefit of radio-frequency reuse, and by a prescriptive package, PSET, that would set the thousands of system parameters required. These were large-scale analytic computer programs solving difficult problems in the new cellular technology.
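     The combinatorial engine in NOVA is easiest to convey with a toy. The C sketch below anneals a channel assignment on a six-cell interference graph, accepting worse moves with probability exp(-delta/T) as the temperature cools; the production objective measured real radio-frequency interference rather than a simple conflict count.

```c
/* Simulated-annealing sketch: reassign one cell's channel at random,
   keep improvements, accept worsenings with probability exp(-d/T). */
#include <stdio.h>
#include <stdlib.h>
#include <math.h>

#define NCELLS 6
#define NCHAN  3
static int neighbor[NCELLS][NCELLS] = { /* 1 = these cells interfere */
    {0,1,1,0,0,0},{1,0,1,1,0,0},{1,1,0,1,1,0},
    {0,1,1,0,1,1},{0,0,1,1,0,1},{0,0,0,1,1,0}
};
static int chan[NCELLS];

static int conflicts(void) {            /* co-channel neighbor pairs */
    int c = 0;
    for (int i = 0; i < NCELLS; i++)
        for (int j = i + 1; j < NCELLS; j++)
            if (neighbor[i][j] && chan[i] == chan[j]) c++;
    return c;
}

int main(void) {
    srand(7);
    for (int i = 0; i < NCELLS; i++) chan[i] = rand() % NCHAN;

    double T = 2.0;
    int cur = conflicts();
    for (int iter = 0; iter < 5000; iter++, T *= 0.999) {
        int cell = rand() % NCELLS, old = chan[cell];
        chan[cell] = rand() % NCHAN;
        int nxt = conflicts();
        if (nxt > cur && exp((cur - nxt) / T) < (double)rand() / RAND_MAX)
            chan[cell] = old;              /* reject the worsening move */
        else
            cur = nxt;                     /* accept */
    }
    printf("final conflicts: %d\n", cur);
    return 0;
}
```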

     Transmission Network Design Studies: My network-design studies ranged from microwave "beam-radio" access capacity to a simulation study of a self-healing network without any central processing, a new and interesting idea at that time. I also did access capacity studies for AT&T's bidding on the Federal Telephone System FTS-2000 project and a study showing little value would be added to private line networks by a proposed packet technology for long distance.

     Airline Planning: When selling linear programming as a product wasn't working, AT&T decided it made more sense to sell applications to a specific industry, in this case the airlines. I was able to simplify the solutions we were selling by using combinatorial search algorithms instead of complex math-programming solutions. I designed, documented, and developed fleet-assignment and flight-sequence software that was sold to two airlines. The combinatorial optimizer (COPT) assigned fleets that saved our clients about a million dollars a month, and the rotation optimizer (ROPT) made their maintenance intervals far more regular for flexible planning.

    

    

ACADEMIC WORK EXPERIENCE

     Educaid, Tutor, 1996 March through 1997 April.

     As a one-on-one tutor in the Educaid network I taught Mathematics, Physics, Statistics, Astronomy, and Operations Research at the high school, undergraduate, and graduate levels. Students ranged from teenagers to working adults taking courses for professional growth.

     University of Minnesota, Adjunct Professor, 1994 September through 1994 December.

     Taught Introduction to Linear Programming, a Ph.D. level course, for the Operations and Management Science Department at the Carlson School of Management. I got good reports from other faculty and excellent teaching reviews from my eleven students. I wrote the course, gave the lectures, and wrote and graded the homework assignments and tests.

     Georgia Tech, Placement Optimizer, 1994 May through 1995 July.

     The Universal High Speed Placer, HSP 4790, takes parts from feeder tapes and inserts them onto printed circuit boards with a pneumatic turret. Working with Georgia Tech faculty and Ford engineers, I was able to add features to their existing software, streamline its data architecture, and correct some algorithm flaws and modeling errors.

    

    

WORK EXPERIENCE (AFTER HOURS, SUMMER)

     CDMA book for McGraw-Hill, 2001 July through 2002 October.

     Writing a book is far more daunting than writing a memorandum or a white paper, the same way teaching a course is more than giving a lecture. There is a promise of completeness in a book or course, and that's difficult in a rapidly changing field like mobile telephony, particularly when my own experience was six years old. My book was intended to bridge the gap between the technical study of Code Division Multiple Access and the practical issues of deploying a CDMA telephone system. The decline in the telephony-engineering job market kept book sales low, I'm afraid. The title is CDMA Capacity and Quality Optimization.

     Voice mail simulation for AT&T Bell Labs, 1992 May through 1993 April.

     A friend of mine was part of a team designing a multi-node voice-mail system for a business telephone network, and key design decisions were being made based on "back of the envelope" calculations or just the intuition of the designers. The team realized it needed quantitative data on the impact of these design decisions on the network, so I designed, documented, and developed a simulation of the voice-mail network so traffic loads and service delays could be quantified.

     Design Computation, Consultant, 1985 May through 1991 March.

     This was a cool job. Three of my engineering buddies and I decided to go into the printed-circuit-board-routing business. Two of them wrote a graphical editor (programmer friendly, user hostile). One of them wrote tools, built test cases, and handled technical support. I designed and developed an autorouter that would take a schematic network (called a "rat's nest" from its graphical network display) and route printed-circuit-board traces with output for manufacture. Besides doing straight "hug the traces" routing, both 45-degree and orthogonal, my router did L-shape and C-shape routes with inter-layer "vias" and used rip-up-and-retry to find trace combinations. Our product was never popular, but it had a small, devout following on six continents because of its capabilities.
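     The routing core under all those shapes is a grid search. The C sketch below runs a Lee-style breadth-first wave around blocked cells on a toy grid; in the real router, a failure at this stage is what triggered rip-up-and-retry, removing an earlier trace and routing again.

```c
/* Lee-style maze routing sketch: breadth-first wave expansion finds
   the shortest orthogonal path from source to target around blocks. */
#include <stdio.h>
#include <string.h>

#define W 10
#define H 6
static char grid[H][W + 1] = {   /* '#' = existing trace, '.' = free */
    "..........",
    "..######..",
    ".......#..",
    ".#####.#..",
    ".......#..",
    ".........."
};
static int dist[H][W];

int main(void) {
    int sx = 0, sy = 0, tx = 9, ty = 5;      /* source and target pins */
    int qx[W * H], qy[W * H], head = 0, tail = 0;
    memset(dist, -1, sizeof dist);           /* -1 = not yet reached */

    dist[sy][sx] = 0;
    qx[tail] = sx; qy[tail++] = sy;
    while (head < tail) {
        int x = qx[head], y = qy[head++];
        static const int dx[] = {1, -1, 0, 0}, dy[] = {0, 0, 1, -1};
        for (int d = 0; d < 4; d++) {
            int nx = x + dx[d], ny = y + dy[d];
            if (nx < 0 || nx >= W || ny < 0 || ny >= H) continue;
            if (grid[ny][nx] == '#' || dist[ny][nx] >= 0) continue;
            dist[ny][nx] = dist[y][x] + 1;
            qx[tail] = nx; qy[tail++] = ny;
        }
    }
    if (dist[ty][tx] < 0)
        printf("no route: a rip-up-and-retry pass would remove a trace\n");
    else
        printf("route found, length %d\n", dist[ty][tx]);
    return 0;
}
```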

     The Psionic Corporation, 1980 January through 1983 July.

     After I patented my phonograph tonearm (remember vinyl records?), I decided to form my own company to manufacture it as a product. I worked with a designer, a machinist, and a variety of vendors to bring about 80 LOCI tonearms to market, some of which are still playing records today. My audio science was good, but I had a lot to learn about hi-fi dealer networking and about product manufacturing and production.

     Xerox PARC Analysis Research Group, 1979 June through 1979 August.

     I spent a summer working for Xerox between my first two years in graduate school. It was a fun place to work at that time because Xerox was introducing the first personal computer, the Alto, and the first graphical user interface (GUI) software. I did a study of the relationship between sales quotas and economic motivation and another study of time-varying loads on copy centers serving multiple classes of customer work. (PARC stands for Palo Alto Research Center.)

    

    

EDUCATION

1983 September, Ph.D. in Operations Research from Stanford University
1979 June, M.S. in Operations Research from Stanford University
1978 June, A.B. in Mathematics from Princeton University

    

    

PUBLICATION

Adam N. Rosenberg and Sid Kemp, CDMA Capacity and Quality Optimization, McGraw-Hill Telecom Engineering, 2002.

    

    

AWARDS

Three U. S. Patents pending from work at SAP/Khimetrics.
Four AT&T Bell Laboratories Individual Performance Awards, 1984-1991.
U. S. Patent 4,182,517 for Articulated Parallelogram Tonearm in 1980.
Graduated Cum Laude from Princeton in 1978.

    

    

COMPUTER EXPERIENCE

• Languages: extensive experience in FORTRAN and C.
• Database: SQL, Oracle, ODBC, PostgreSQL, libpq C library.
• Systems: Linux, UNIX, IBM AIX, DOS/Windows, VAX/VMS.
• Environments: UNIX shell, JCL, IBM ISPF, X-Window.