Better Coding

  • Eric M Russell (8/25/2016)


    Matt Miller (#4) (8/25/2016)


    SQLBill (8/25/2016)


    Eric M Russell (8/25/2016)


    I'm not surprised that a small group of students could take a public dataset and spin up a better application than the original government IT contractors. When dealing with non-sensitive datasets, perhaps the government should approach this from the bottom up, rather than the top down. For example, start by publishing the data and a set of high level requirements, see who can create the best prototype solution that meets the requirements, and then award them a contract to finish off the project.

    If some bloke from Denver named Dan can create BI solutions better, faster, and cheaper than a billion-dollar corporation that specializes in manufacturing jet planes and electronics, then why not just hand the job over to Dan?

    Here I'm just talking about the application development itself. The government could still farm out the infrastructure and support to the usual contractors.

    Because 'Dan' is designing it on his own...not following the Government's criteria. This is where the issues really happen. The government can be really specific about what it requires. Read a government contract sometime. There's been a lot in the news about the cost of the toilets on the shuttle or the space station, and about the costs of military aircraft and ships. But when the requirement includes a specific description of a wrench, and that description means you have to create a process to make that wrench because the $5 one you can find in the store doesn't exactly match the requirement...there go the costs, up and up.

    -SQLBill

    Agreed- as a friend who was a veteran pointed out to me: you go design a toilet that can be airdropped from 500 feet out of the back of a plane without a parachute, that won't break, doesn't require extensive assembly and can run without a water or power supply, and see what price tag YOU come up with :w00t:

    Like I said, the government would publish requirements (both internal and public facing) for the web application along with the dataset. If the data is aggregated properly, then securing the application is straightforward. The analogy of an indestructible military-grade hammer doesn't pertain to a public website for visualizing census data. If someone thinks it does, then that's part of the problem.

    Look I am fairly confident that no one cares if the census web server can be dropped from 500 feet and still work....:-D

    Seriously though - designing something to simply capture data is trivial, if you don't care about securing it, session management, validating who's filling the data in, ensuring that it actually does store the data, and so on. There's no denying that there should be public requirements and a review/bid process - but pretending that the non-functional requirements are "the easy part" is like assuming the part of the iceberg you can see is the only part. I could build an accounting system out of Microsoft Access - but I wouldn't want to manage a company the size of Microsoft out of it.

    I wouldn't doubt that in this case - some requirements made no sense or artificially pushed up complexity or otherwise destabilized the solution. Still - I wouldn't go so far as calling it trivial.

    ----------------------------------------------------------------------------------
    Your lack of planning does not constitute an emergency on my part...unless you're my manager...or a director and above...or a really loud-spoken end-user..All right - what was my emergency again?

  • Eric M Russell (8/25/2016)


    Federal contractors, at least within the realm of IT, farm the actual work out to sub-contractors anyhow, and I can tell you that those sub-contractors are no more skilled than the rank and file IT folks you would find in corporate America or fresh university graduates.

    Says something about the state of our industry.

    One reason we've pushed the SQL Saturdays and training more people is to raise the bar where we can.

  • Steve Jones - SSC Editor (8/25/2016)


    Eric M Russell (8/25/2016)


    Federal contractors, at least within the realm of IT, farm the actual work out to sub-contractors anyhow, and I can tell you that those sub-contractors are no more skilled than the rank and file IT folks you would find in corporate America or fresh university graduates.

    Says something about the state of our industry.

    One reason we've pushed the SQL Saturdays and training more people is to raise the bar where we can.

    It's the state of every industry, though; no profession anywhere has people of equal skill. Even when you start looking at doctors, some are good and some are not.

  • Matt Miller (#4) (8/25/2016)


    Eric M Russell (8/25/2016)


    Matt Miller (#4) (8/25/2016)


    SQLBill (8/25/2016)


    Eric M Russell (8/25/2016)


    I'm not surprised that a small group of students could take a public dataset and spin up a better application than the original government IT contractors. When dealing with non-sensitive datasets, perhaps the government should approach this from the bottom up, rather than the top down. For example, start by publishing the data and a set of high level requirements, see who can create the best prototype solution that meets the requirements, and then award them a contract to finish off the project.

    If some bloke from Denver named Dan can create BI solutions better, faster, and cheaper than a billion-dollar corporation that specializes in manufacturing jet planes and electronics, then why not just hand the job over to Dan?

    Here I'm just talking about the application development itself. The government could still farm out the infrastructure and support to the usual contractors.

    Because 'Dan' is designing it on his own...not following the Government's criteria. This is where the issues really happen. The government can be really specific about what it requires. Read a government contract sometime. There's been a lot in the news about the cost of the toilets on the shuttle or the space station, and about the costs of military aircraft and ships. But when the requirement includes a specific description of a wrench, and that description means you have to create a process to make that wrench because the $5 one you can find in the store doesn't exactly match the requirement...there go the costs, up and up.

    -SQLBill

    Agreed- as a friend who was a veteran pointed out to me: you go design a toilet that can be airdropped from 500 feet out of the back of a plane without a parachute, that won't break, doesn't require extensive assembly and can run without a water or power supply, and see what price tag YOU come up with :w00t:

    Like I said, the government would publish requirements (both internal and public facing) for the web application along with the dataset. If the data is aggregated properly, then securing the application is straightforward. The analogy of an indestructible military-grade hammer doesn't pertain to a public website for visualizing census data. If someone thinks it does, then that's part of the problem.

    Look I am fairly confident that no one cares if the census web server can be dropped from 500 feet and still work....:-D

    Seriously though - designing something to simply capture data is trivial, if you don't care about securing it, session management, validating who's filling the data in, ensuring that it actually does store the data, and so on. There's no denying that there should be public requirements and a review/bid process - but pretending that the non-functional requirements are "the easy part" is like assuming the part of the iceberg you can see is the only part. I could build an accounting system out of Microsoft Access - but I wouldn't want to manage a company the size of Microsoft out of it.

    I wouldn't doubt that in this case - some requirements made no sense or artificially pushed up complexity or otherwise destabilized the solution. Still - I wouldn't go so far as calling it trivial.

    My point was that the client's requirements could cause a 're-tooling' instead of an 'off the shelf solution'. Re-tooling or redesigning will always be more expensive.

    There are petitions, letters, etc. to the U.S. Census Bureau asking it to expand the Gender/Sex options; currently they are Male/Female. When you are creating an application/backend/process that needs Gender/Sex information, you can probably find an existing application/backend/process to reuse and convert/update to do what you need. But what if you have to include different gender/sex options? Most likely it will force a major rewrite of an existing process, or the creation of something entirely new. Where a company could save money by repurposing an existing application/backend/process when only two gender/sex options are needed, it could be more expensive to make an existing one meet the criteria of multiple gender/sex options.

    So, while the students came up with something less expensive...did it meet all the requirements of the client? Did the expensive one cost so much because of the client requirements?

    -SQLBill
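
    The schema point above can be sketched in a few lines: if the set of gender/sex options lives in a lookup table rather than being baked into the schema, adding an option becomes a data change instead of a rewrite. This is a minimal illustration using SQLite from Python; the table and column names are invented for the example, not taken from any real census system.

```python
import sqlite3

# Minimal sketch: keep the gender/sex options as *data* in a lookup table,
# not as a hard-coded constraint in the schema. Names are illustrative only.
conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE gender_option (
        gender_id   INTEGER PRIMARY KEY,
        description TEXT NOT NULL UNIQUE
    );
    CREATE TABLE census_response (
        response_id INTEGER PRIMARY KEY,
        gender_id   INTEGER NOT NULL REFERENCES gender_option(gender_id)
    );
    INSERT INTO gender_option (description) VALUES ('Male'), ('Female');
""")

# A later requirement change is a one-row INSERT, not a schema rewrite.
conn.execute("INSERT INTO gender_option (description) VALUES ('Non-binary')")

options = [row[0] for row in
           conn.execute("SELECT description FROM gender_option ORDER BY gender_id")]
print(options)  # → ['Male', 'Female', 'Non-binary']
```

    Of course this only helps where the surrounding application was also written against the lookup table rather than assuming two values; the point in the post stands, since an existing two-option system would still need reworking.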

  • In the UK there are restrictions to do with company financial state and length of operation that prevent smaller players from being considered for government contracts.

    When the contracts come in they tend to be lucrative but the process of applying for and winning the contract is lengthy and expensive. It favours the larger monolith vendors.

    I know of people in the civil service who are frustrated that their department has a technical specialism that is rare in the private sector, yet they find a significant portion of their work being outsourced and done to a lower standard than they could achieve with their own resources. To add insult to injury, the stuff that is outsourced is the simple and cheap stuff, not the onerous stuff, which still has to be done in order for the simple stuff to be outsourced in the first place. The outsourcer also requires substantial supervision, as the work has to be of a certain standard.

    As the cheap stuff is 40% of the work volume, the financial budget and staff are cut by 40%. However, the cheap stuff represents 5% of the costs, not 40%.

    The upshot is that the remaining staff have to do more work than before, both to support the outsourced resource and to cover for their departed colleagues.

    They then face criticism for an increased backlog: "40% of the work has been outsourced, how can you still have a backlog?"

    Discussions with large monolith vendors take place at a level of the organisation far removed from the coal face, where the day-to-day challenges are understood, at best, in the abstract and at a summary level.

    The problem is, as Muhammad Ali said, it's not the size of the mountain that defeats you, it's the stone in your shoe.

  • David.Poole (8/25/2016)


    For a census, the app simply collects the POST from a web form and persists it.

    Given that it is a once-every-10-years exercise with fixed questions, you are not far from having a write-only app. Maybe you need something to allow people to go back to previous pages, but beyond that I can't see the benefit in providing any application data retrieval...

    I'm inclined to agree with David Poole. Collecting census data has many of the attributes of a "write only", or at least "write once", application. Once the form has been submitted, the person filling it in has no need to access it again. In fact, from that point on the data is really only used in aggregate.
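
    A "write once" collection store really can be that small. Below is a minimal sketch of the idea, assuming an insert-only table whose only read path is an aggregate; the schema and field names are invented for illustration, not taken from any real census system.

```python
import json
import sqlite3

# Sketch of a "write once" census store: submissions can be inserted and
# counted, but the module deliberately exposes no per-record read or update
# path. All names here are illustrative.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE submission (id INTEGER PRIMARY KEY, payload TEXT NOT NULL)")

def persist_submission(form_data: dict) -> None:
    """Insert-only: once submitted, a form is never fetched back individually."""
    conn.execute("INSERT INTO submission (payload) VALUES (?)",
                 (json.dumps(form_data),))

def submission_count() -> int:
    """The only read path is an aggregate, matching write-once usage."""
    return conn.execute("SELECT COUNT(*) FROM submission").fetchone()[0]

persist_submission({"household_size": 3})
persist_submission({"household_size": 1})
print(submission_count())  # → 2
```

    As the thread notes, the hard parts that don't appear in a sketch like this are the non-functional requirements: securing the endpoint, validating who is submitting, and guaranteeing durability at census-night load.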

    In Ireland they still distribute and collect the forms manually. It is expensive but it has benefits. This year I participated in a census as an "enumerator" - that's the guy who distributes the census form, collects it and helps with any questions people have. I did it because it was an opportunity to get out from in front of a computer screen! I looked at Steve's link to the US form, and it is pretty similar but the Irish one seems to have more classification data. My experiences may be of some interest for this discussion.

    My experience of the form and the process I was following bore the marks of "punched cards" and batch processing. That's not necessarily wrong, but it was noticeable. I also noticed that the problems were not so much system-related as people-related. At least some of my job was to persuade people to fill the form in at all! Another part was to help them fill it in correctly and detect minor silly omissions - the so-called "doorstep check". And at least some of what I did involved dealing with edge cases: people who were somewhere else on the night, or questions like "do I tick this box or that box?"
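
    The "doorstep check" maps directly onto form-level validation in a digital census: catch missing answers and obvious inconsistencies before accepting the submission. A minimal sketch follows, with required fields and a cross-field rule invented purely for illustration, not taken from any real census form.

```python
# Sketch of an automated "doorstep check": flag missing or obviously
# inconsistent answers before accepting a form. Field names are invented
# for illustration.
REQUIRED_FIELDS = ("name", "date_of_birth", "sex", "relationship_to_head")

def doorstep_check(form: dict) -> list:
    """Return a list of human-readable problems; an empty list means accept."""
    problems = [f"missing answer: {field}"
                for field in REQUIRED_FIELDS if not form.get(field)]
    # Example cross-field check: a one-person household can't also report visitors.
    if form.get("household_size") == 1 and form.get("visitors_present"):
        problems.append("household of one cannot also list visitors present")
    return problems

form = {"name": "Pat", "sex": "F", "household_size": 1, "visitors_present": True}
print(doorstep_check(form))
```

    What a check like this cannot do, of course, is the other half of the enumerator's job: persuading someone to fill the form in at all.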

    For census data to be useful you want to get as close to 100% coverage as you can. That was where I came in (though I didn't realise it at first). There are some groups which tend to exclude themselves and you want to include them in the dataset. People like: foreigners, the illiterate, the elderly... you can add your own groups to the list.

    From a systems point of view, a census is peculiar too. Would you like to be running a system which you got out of the box and ran once every 5 or 10 years? On the other hand would you like to develop it fresh every 10 years? Even though the data is so simple, systems have moved quite a bit in 5 years!

    In conclusion: I think census systems have real potential for some kind of "package" solution. The core idea is dead simple. Every country runs them so there is a market. The issues are the fact they are run infrequently in any particular place, the problems around promoting completeness and accuracy and the political problems relating to privacy and security. It's a good one to puzzle over! 🙂

    Tom Gillies | LinkedIn Profile | www.DuhallowGreyGeek.com
