Scanning large rows into a struct in Go

Problem description:

I'm working with a database that has yet to be normalized; one of its tables contains records with over 40 columns.

The following is my Go code for (attempting to) scan the records into a large struct:

type Coderyte struct {
  ID int `json:"id"`
  EmrID int `json:"emr_id"`
  DftID int `json:"dft_id"`

  Fix int `json:"fix"`
  ReportDate string `json:"report_date"` // Time?
  Patient string `json:"patient"`
  ... // etc
}

func ReadCoderyte(res http.ResponseWriter, req *http.Request) {
  rows, err := db.Query("SELECT * FROM coderyte")
  if err != nil {
    http.Error(res, "Error querying database", 500)
    return
  }
  defer rows.Close()

  // Convert rows into a slice of Coderyte structs
  coderytes := make([]*Coderyte, 0)
  for rows.Next() {
    coderyte := new(Coderyte)
    err := rows.Scan(&coderyte) // Fails here: expected 42 destination arguments
    if err != nil {
      http.Error(res, "Error converting coderyte object", 500)
      return
    }
    coderytes = append(coderytes, coderyte)
  }
}

When I call this code, Scan complains that it "expected 42 destination arguments, not 1". My understanding is that I would need to pass a pointer for every single field of this large struct to the Scan call, i.e. Scan(&coderyte.ID, &coderyte.EmrID, etc.).
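Spelled out, that would look something like this sketch (field names taken from the struct above; the real call would list all 42 columns in SELECT order):

err := rows.Scan(
  &coderyte.ID,
  &coderyte.EmrID,
  &coderyte.DftID,
  &coderyte.Fix,
  &coderyte.ReportDate,
  &coderyte.Patient,
  // ... one pointer per remaining column
)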

My searches have only yielded this other question, where the suggested answer is to use sqlx. I'm trying to avoid using a third-party tool if I don't need it.

My question boils down to: is there a way to scan a large database record into a struct without specifying every single field?

I should also note that the ultimate goal of this function is to return a JSON array of objects, but I did not include that part of the code because I feel it is not important. If there is a way to bypass Scan and return JSON directly, that would be a welcome answer as well.

the ultimate goal of this function is to return a JSON array of objects

It sounds like you could bypass the struct entirely, then, and instead scan into a map[string]interface{}, doing the whole thing dynamically. You could do something like this:

rows, _ := db.Query("SELECT * FROM coderyte")
cols, _ := rows.Columns()
store := []map[string]interface{}{}
for rows.Next() {
    // Make one slice to receive the values and a second slice of
    // pointers into it, since Scan needs a pointer per column.
    columns := make([]interface{}, len(cols))
    columnPointers := make([]interface{}, len(cols))
    for i := range columns {
        columnPointers[i] = &columns[i]
    }

    if err := rows.Scan(columnPointers...); err != nil {
        return err
    }

    // Build a map keyed by column name from the scanned values.
    m := make(map[string]interface{})
    for i, colName := range cols {
        val := columnPointers[i].(*interface{})
        m[colName] = *val
    }
    store = append(store, m)
}
js, _ := json.Marshal(store)
fmt.Println(string(js))
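Since the stated goal is an HTTP handler that returns a JSON array, you would presumably write those bytes to the ResponseWriter instead of printing them (a sketch, reusing the res http.ResponseWriter from the question's handler):

res.Header().Set("Content-Type", "application/json")
res.Write(js)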

Now, obviously you could also convert it to a struct, since you could take that JSON and json.Unmarshal it into a []Coderyte, but given your use case that seems like a pointless extra step.

js, _ := json.Marshal(store)
structs := []Coderyte{}
json.Unmarshal(js, &structs)

All that being said, you should probably just use sqlx; it almost certainly handles these cases more cleverly and more efficiently than a hand-rolled version.
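For completeness, a rough sketch of what that might look like with sqlx (untested; it assumes db is an *sqlx.DB, e.g. from sqlx.Open, and that the Coderyte fields carry db:"..." struct tags matching the column names):

import "github.com/jmoiron/sqlx"

var coderytes []Coderyte
// Select runs the query and scans every row into the slice,
// matching columns to `db` struct tags via reflection.
if err := db.Select(&coderytes, "SELECT * FROM coderyte"); err != nil {
    // handle the error
}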